ROS Workshop – ESP8266 LED Control via UDP

Control an LED on an ESP8266 microcontroller using UDP – from simple text commands to AI-powered hand gestures, drowsiness detection, and a multi-node ROS2 keyboard control system.


Join the Workshop WhatsApp Group - Community Maintained

WhatsApp Group QR

👉 Click here to join the WhatsApp Group


Project Structure

ROS_Workshop/
├── p7/                                  # ESP8266 + Python programs
│   ├── p7/p7.ino                        # Arduino: ESP8266 UDP LED server
│   ├── udp_client.py                    # Simple text-based UDP client
│   ├── camera_test.py                   # Basic OpenCV camera test
│   ├── finger_udp.py                    # Finger counting → LED control (MediaPipe)
│   ├── drowsiness_udp.py                # Drowsiness detection → LED alert (dlib)
│   ├── hand_landmarker.task             # [DOWNLOAD] MediaPipe hand model
│   └── shape_predictor_68_face_landmarks.dat  # [DOWNLOAD] dlib face model
│
├── ros2_my_own_ws/                      # ROS2 workspace (single-node bridge)
│   └── src/udp_led_bridge/              # ROS2 package: UDP LED bridge
│       └── udp_led_bridge/
│           └── udp_sender_node.py       # ROS2 node: topic → UDP → ESP8266
│
├── ros2_keyboar_led_control/            # ROS2 workspace (3-node keyboard system)
│   └── src/keyboard_node/               # ROS2 package: keyboard LED control
│       └── keyboard_node/
│           ├── keyboard_input_node.py   # Node 1: reads keystrokes (a/b/q)
│           ├── udp_sender_node.py       # Node 2: sends UDP to ESP8266
│           └── status_display_node.py   # Node 3: displays LED status
│
├── YoloExamples/                        # YOLO object detection examples
│   ├── object_detection.py              # Real-time webcam object detection
│   ├── annotate/                        # Web-based annotation tool (Flask)
│   │   ├── app.py                       # Flask backend + API routes
│   │   └── templates/                   # Jinja2 + TailwindCSS pages
│   ├── annotate_images.py               # CLI annotation tool (OpenCV)
│   ├── train_custom_model.py            # Custom YOLO training pipeline
│   ├── yolo_training_workflow.ipynb     # Jupyter notebook: full training guide
│   └── yolov8n.pt                       # [AUTO-DOWNLOAD] YOLOv8 Nano model
│
├── docs/                                # Step-by-step documentation
│   ├── 00_installation_and_get_started.md  # Installation guide (Git, ROS2, Arduino, etc.)
│   ├── 01_what_is_udp.md                # UDP explained simply
│   ├── 02_what_is_ros2.md               # ROS2 concepts for beginners
│   ├── 03_project_overview.md           # Project structure & how things connect
│   ├── 04_build_and_run.md              # Full build & run instructions
│   ├── 05_how_each_program_works.md     # Deep dive into each program
│   ├── 06_keyboard_led_control.md       # 3-node keyboard system guide
│   ├── 07_yolo_ultralytics.md           # YOLO & Ultralytics deep dive
│   ├── 08_annotating_and_training.md    # Annotation, training & public datasets
│   ├── Annotator Tool/                  # Annotator tool screenshots
│   └── Whatsapp QR for Group/           # WhatsApp group QR code image
│
├── pyproject.toml                       # Python dependencies (managed by uv)
└── README.md                            # You are here!

Quick Start

1. Flash the ESP8266

Open p7/p7/p7.ino in Arduino IDE, select your ESP8266 board, and upload. Open Serial Monitor (115200 baud) to see the ESP's IP address.
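Once the sketch is flashed, you can sanity-check the UDP path from Python before moving on to the AI demos. A minimal sketch, assuming the IP/port from the Network Setup section below and the "on"/"off" payloads used by udp_client.py (the function name is illustrative):

```python
import socket

def send_led_command(command: str, esp_ip: str = "10.160.6.231", port: int = 4210) -> int:
    """Fire one UDP datagram at the ESP8266 and return the bytes sent.

    UDP is connectionless, so sendto() succeeds even if nothing is
    listening -- watch the LED (or the ESP's Serial Monitor) to confirm
    the command actually arrived.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        return sock.sendto(command.encode("utf-8"), (esp_ip, port))
```

Calling send_led_command("on") from a Python shell should light the LED if the ESP is reachable; substitute whatever IP the Serial Monitor printed.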

2. Install Python Dependencies

uv sync

3. Download AI Model Files

These model files are too large for git. Download them into the p7/ directory:

MediaPipe Hand Landmarker (~12 MB) – needed by finger_udp.py:

wget -O p7/hand_landmarker.task \
  https://storage.googleapis.com/mediapipe-models/hand_landmarker/hand_landmarker/float16/1/hand_landmarker.task

dlib Shape Predictor 68 (~99 MB) – needed by drowsiness_udp.py:

wget -O p7/shape_predictor_68_face_landmarks.dat.bz2 \
  https://github.com/davisking/dlib-models/raw/master/shape_predictor_68_face_landmarks.dat.bz2
bunzip2 p7/shape_predictor_68_face_landmarks.dat.bz2

YOLOv8 Nano (~6 MB) – needed by YoloExamples/object_detection.py:

cd YoloExamples
uv run python -c "from ultralytics import YOLO; YOLO('yolov8n.pt')"

The model also downloads automatically on first run; this command just fetches it ahead of time.

4. Run a Program

# Test camera
uv run python p7/camera_test.py

# Simple text control (type "on" / "off")
uv run python p7/udp_client.py

# Finger counting (2 fingers = ON, 3 = OFF)
uv run python p7/finger_udp.py

# Drowsiness detection (eyes closed = ON)
uv run python p7/drowsiness_udp.py

# YOLO real-time object detection (press q to quit, s to screenshot)
uv run python YoloExamples/object_detection.py
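The finger-counting demo comes down to one decision step after MediaPipe returns a count. A hedged sketch of that mapping, following the convention in the comments above (2 fingers = ON, 3 = OFF); treating every other count as a no-op is an assumption:

```python
from typing import Optional

def command_for_fingers(finger_count: int) -> Optional[str]:
    """Map a detected finger count to an LED command.

    Workshop convention: 2 fingers -> "on", 3 fingers -> "off".
    Any other count returns None so the main loop sends nothing,
    which avoids spamming the ESP8266 while the hand is moving.
    """
    if finger_count == 2:
        return "on"
    if finger_count == 3:
        return "off"
    return None
```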

5. Run the ROS2 Single-Node Bridge

cd ros2_my_own_ws
colcon build --packages-select udp_led_bridge
source install/setup.bash

# Terminal 1: Start the node
ros2 run udp_led_bridge udp_sender_node

# Terminal 2: Send commands
ros2 topic pub --once /led_command std_msgs/String "data: 'on'"
ros2 topic pub --once /led_command std_msgs/String "data: 'off'"
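Under the hood, a bridge node like this typically just turns each topic message into a UDP datagram. Below is a minimal sketch of that forwarding core with the rclpy wiring left out; the class name, the validation set, and the idea that unknown commands are rejected are assumptions, not a transcript of udp_sender_node.py:

```python
import socket

class UdpLedBridge:
    """Forwarding core a ROS2 bridge node could wrap: each incoming
    /led_command message becomes one UDP datagram to the ESP8266.
    In the real node, an rclpy subscription callback would call forward().
    """

    VALID = {"on", "off"}

    def __init__(self, esp_ip: str, port: int = 4210):
        self.addr = (esp_ip, port)
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def forward(self, command: str) -> bool:
        """Normalize and send a command; reject anything unexpected."""
        command = command.strip().lower()
        if command not in self.VALID:
            return False
        self.sock.sendto(command.encode("utf-8"), self.addr)
        return True

    def close(self):
        self.sock.close()
```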

6. Run the ROS2 Keyboard Control (3 Nodes)

cd ros2_keyboar_led_control
colcon build --packages-select keyboard_node
source install/setup.bash

Open 3 terminals (run cd ~/ROS_Workshop/ros2_keyboar_led_control && source install/setup.bash in each):

# Terminal 1 β€” Status display
ros2 run keyboard_node status_display_node

# Terminal 2 β€” UDP sender
ros2 run keyboard_node udp_sender_node

# Terminal 3 β€” Keyboard input (press a=ON, b=OFF, q=Quit)
ros2 run keyboard_node keyboard_input_node

To use a different ESP IP:

ros2 run keyboard_node udp_sender_node --ros-args -p esp_ip:="192.168.1.50"

Programs Overview

| Program | Method | What it does |
| --- | --- | --- |
| p7/p7/p7.ino | Arduino | ESP8266 listens for UDP, controls LED |
| p7/udp_client.py | Terminal | Type "on"/"off" to control LED |
| p7/camera_test.py | Camera | Test webcam with OpenCV |
| p7/finger_udp.py | AI (MediaPipe) | Count fingers → control LED |
| p7/drowsiness_udp.py | AI (dlib) | Detect drowsiness → alert LED |
| udp_sender_node.py (udp_led_bridge) | ROS2 bridge | ROS2 topic → UDP → ESP8266 |
| keyboard_input_node.py | ROS2 | Read keystrokes (a/b/q), publish commands |
| udp_sender_node.py (keyboard_node) | ROS2 | Forward commands via UDP, publish status |
| status_display_node.py | ROS2 | Display LED status updates in terminal |
| YoloExamples/object_detection.py | AI (YOLO) | Real-time object detection with webcam |
| YoloExamples/annotate/app.py | Tool (Flask) | Web-based image annotation (like Roboflow) |
| YoloExamples/annotate_images.py | Tool (CLI) | Command-line image annotation (OpenCV) |
| YoloExamples/train_custom_model.py | AI (YOLO) | Train/predict/export custom YOLO models |
| YoloExamples/yolo_training_workflow.ipynb | Notebook | Interactive Jupyter training guide |

Annotator Tool (Web-Based Image Labeling)

Our browser-based annotation tool for labeling images in YOLO format – 100% local, no cloud needed.

uv run python YoloExamples/annotate/app.py
# Opens at http://localhost:5000

Home Page β€” Select a folder of images or upload new ones:

Annotator Home Page

Folder Browser β€” Navigate to your images, define classes, and start annotating:

Annotator Folder Browser

Annotation Page β€” Draw bounding boxes, manage classes, track progress, save & export:

Annotator Annotation Page

Supports multi-class labeling – draw boxes with different classes on the same image. Each box becomes a line in the YOLO .txt label file. See the Annotating & Training docs for full details.
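A YOLO label line has the form `class_id x_center y_center width height`, with all four box numbers normalized by the image dimensions. A small sketch of the pixel-box conversion the annotator performs when it saves (function name is illustrative):

```python
def yolo_label_line(class_id: int, x1: float, y1: float, x2: float, y2: float,
                    img_w: int, img_h: int) -> str:
    """Convert a pixel-space box (x1, y1)-(x2, y2) into one YOLO label line:
    'class_id x_center y_center width height', all normalized to [0, 1]."""
    x_center = (x1 + x2) / 2 / img_w
    y_center = (y1 + y2) / 2 / img_h
    width = (x2 - x1) / img_w
    height = (y2 - y1) / img_h
    return f"{class_id} {x_center:.6f} {y_center:.6f} {width:.6f} {height:.6f}"
```

For a 640×480 image, a box from (0, 0) to (320, 240) with class 0 becomes the line `0 0.250000 0.250000 0.500000 0.500000`.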


Network Setup

  • WiFi SSID: OnePlusRajath
  • ESP8266 Default IP: 10.160.6.231
  • UDP Port: 4210
  • Both your PC and ESP8266 must be on the same WiFi network

Dependencies

ROS2 Jazzy is required for the ROS2 workspaces (ros2_my_own_ws and ros2_keyboar_led_control).

Python packages managed via pyproject.toml with uv:

  • opencv-python / opencv-contrib-python – Camera and image processing
  • mediapipe – Google's hand landmark detection
  • dlib – Face landmark detection (68-point model)
  • scipy – Distance calculations for EAR (Eye Aspect Ratio)
  • ultralytics – YOLOv8 object detection
  • flask – Web-based annotation tool
  • jupyterlab – Jupyter notebooks for interactive training
  • matplotlib – Plotting training results
  • requests – HTTP library

Documentation

See the docs/ folder for beginner-friendly explanations:

  1. Installation & Getting Started – Git, ROS2 Jazzy, Arduino, uv, Docker, Webots, etc.
  2. What is UDP? – UDP vs TCP, sockets, ports
  3. What is ROS2? – Nodes, Topics, Messages, Parameters
  4. Project Overview – How everything connects
  5. Build & Run – Step-by-step instructions + troubleshooting
  6. How Each Program Works – Deep dive with flowcharts
  7. Keyboard LED Control – 3-node ROS2 system guide
  8. YOLO & Ultralytics – Object detection deep dive, project ideas, embedded + image processing
  9. Annotating & Training – Public datasets, local annotation, training pipeline, complete walkthroughs

License

For educational/workshop use.

About

Getting Started with IoT Basics and ROS
