Control an LED on an ESP8266 microcontroller over UDP: from simple text commands to AI-powered hand gestures, drowsiness detection, and a multi-node ROS2 keyboard control system.
Click here to join the WhatsApp Group
```
ROS_Workshop/
├── p7/                              # ESP8266 + Python programs
│   ├── p7/p7.ino                    # Arduino: ESP8266 UDP LED server
│   ├── udp_client.py                # Simple text-based UDP client
│   ├── camera_test.py               # Basic OpenCV camera test
│   ├── finger_udp.py                # Finger counting → LED control (MediaPipe)
│   ├── drowsiness_udp.py            # Drowsiness detection → LED alert (dlib)
│   ├── hand_landmarker.task         # [DOWNLOAD] MediaPipe hand model
│   └── shape_predictor_68_face_landmarks.dat  # [DOWNLOAD] dlib face model
│
├── ros2_my_own_ws/                  # ROS2 workspace (single-node bridge)
│   └── src/udp_led_bridge/          # ROS2 package: UDP LED bridge
│       └── udp_led_bridge/
│           └── udp_sender_node.py   # ROS2 node: topic → UDP → ESP8266
│
├── ros2_keyboar_led_control/        # ROS2 workspace (3-node keyboard system)
│   └── src/keyboard_node/           # ROS2 package: keyboard LED control
│       └── keyboard_node/
│           ├── keyboard_input_node.py   # Node 1: reads keystrokes (a/b/q)
│           ├── udp_sender_node.py       # Node 2: sends UDP to ESP8266
│           └── status_display_node.py   # Node 3: displays LED status
│
├── YoloExamples/                    # YOLO object detection examples
│   ├── object_detection.py          # Real-time webcam object detection
│   ├── annotate/                    # Web-based annotation tool (Flask)
│   │   ├── app.py                   # Flask backend + API routes
│   │   └── templates/               # Jinja2 + TailwindCSS pages
│   ├── annotate_images.py           # CLI annotation tool (OpenCV)
│   ├── train_custom_model.py        # Custom YOLO training pipeline
│   ├── yolo_training_workflow.ipynb # Jupyter notebook: full training guide
│   └── yolov8n.pt                   # [AUTO-DOWNLOAD] YOLOv8 Nano model
│
├── docs/                            # Step-by-step documentation
│   ├── 00_installation_and_get_started.md  # Installation guide (Git, ROS2, Arduino, etc.)
│   ├── 01_what_is_udp.md            # UDP explained simply
│   ├── 02_what_is_ros2.md           # ROS2 concepts for beginners
│   ├── 03_project_overview.md       # Project structure & how things connect
│   ├── 04_build_and_run.md          # Full build & run instructions
│   ├── 05_how_each_program_works.md # Deep dive into each program
│   ├── 06_keyboard_led_control.md   # 3-node keyboard system guide
│   ├── 07_yolo_ultralytics.md       # YOLO & Ultralytics deep dive
│   ├── 08_annotating_and_training.md  # Annotation, training & public datasets
│   ├── Annotator Tool/              # Annotator tool screenshots
│   └── Whatsapp QR for Group/       # WhatsApp group QR code image
│
├── pyproject.toml                   # Python dependencies (managed by uv)
└── README.md                        # You are here!
```
Open `p7/p7/p7.ino` in the Arduino IDE, select your ESP8266 board, and upload.
Open the Serial Monitor (115200 baud) to see the ESP's IP address.
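Once the sketch is running, any machine on the same network can drive the LED with a few lines of Python. The sketch below is our own illustration of the protocol this project uses (plain "on"/"off" text over UDP port 4210, per the network config later in this README); the helper name is ours, not from the repo:

```python
import socket

def send_led_command(command: str, esp_ip: str, port: int = 4210) -> None:
    """Send "on" or "off" as a single plain-text UDP datagram (fire-and-forget)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(command.encode("utf-8"), (esp_ip, port))
```

Usage: `send_led_command("on", "10.160.6.231")`, substituting the IP from your Serial Monitor. Because UDP is connectionless, there is no delivery confirmation — the ESP simply parses the text of each datagram it receives.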
```
uv sync
```

These model files are too large for git. Download them into the `p7/` directory:

**MediaPipe Hand Landmarker** (~12 MB) – needed by `finger_udp.py`:

```
wget -O p7/hand_landmarker.task \
  https://storage.googleapis.com/mediapipe-models/hand_landmarker/hand_landmarker/float16/1/hand_landmarker.task
```

**dlib Shape Predictor 68** (~99 MB) – needed by `drowsiness_udp.py`:

```
wget -O p7/shape_predictor_68_face_landmarks.dat.bz2 \
  https://github.com/davisking/dlib-models/raw/master/shape_predictor_68_face_landmarks.dat.bz2
bunzip2 p7/shape_predictor_68_face_landmarks.dat.bz2
```

**YOLOv8 Nano** (~6 MB) – needed by `YoloExamples/object_detection.py`:

```
cd YoloExamples
uv run python -c "from ultralytics import YOLO; YOLO('yolov8n.pt')"
```

The model auto-downloads on first run too, but this pre-downloads it.
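A truncated download is a common failure mode here, so it can be worth sanity-checking the files before running anything. The snippet below is a small checker of our own (the minimum sizes are rough lower bounds inferred from the approximate sizes above, not official values):

```python
from pathlib import Path

# Expected model files and rough minimum sizes in bytes. These thresholds are
# our own conservative estimates based on the approximate sizes listed above.
EXPECTED = {
    "p7/hand_landmarker.task": 5_000_000,                     # ~12 MB
    "p7/shape_predictor_68_face_landmarks.dat": 50_000_000,   # ~99 MB
}

def check_models(root: str = ".") -> list[str]:
    """Return a list of human-readable problems; an empty list means all good."""
    problems = []
    for rel, min_size in EXPECTED.items():
        path = Path(root) / rel
        if not path.is_file():
            problems.append(f"missing: {rel}")
        elif path.stat().st_size < min_size:
            problems.append(f"too small (truncated download?): {rel}")
    return problems
```

Run it from the repo root; if it returns an empty list, the downloads look complete.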
```
# Test camera
uv run python p7/camera_test.py

# Simple text control (type "on" / "off")
uv run python p7/udp_client.py

# Finger counting (2 fingers = ON, 3 = OFF)
uv run python p7/finger_udp.py

# Drowsiness detection (eyes closed = ON)
uv run python p7/drowsiness_udp.py

# YOLO real-time object detection (press q to quit, s to screenshot)
uv run python YoloExamples/object_detection.py
```

```
cd ros2_my_own_ws
colcon build --packages-select udp_led_bridge
source install/setup.bash
```
```
# Terminal 1: Start the node
ros2 run udp_led_bridge udp_sender_node

# Terminal 2: Send commands
ros2 topic pub --once /led_command std_msgs/String "data: 'on'"
ros2 topic pub --once /led_command std_msgs/String "data: 'off'"
```

```
cd ros2_keyboar_led_control
colcon build --packages-select keyboard_node
source install/setup.bash
```

Open 3 terminals (run `cd ~/ROS_Workshop/ros2_keyboar_led_control && source install/setup.bash` in each):
```
# Terminal 1 – Status display
ros2 run keyboard_node status_display_node

# Terminal 2 – UDP sender
ros2 run keyboard_node udp_sender_node

# Terminal 3 – Keyboard input (press a=ON, b=OFF, q=Quit)
ros2 run keyboard_node keyboard_input_node
```

To use a different ESP IP:

```
ros2 run keyboard_node udp_sender_node --ros-args -p esp_ip:="192.168.1.50"
```

| Program | Method | What it does |
|---|---|---|
| `p7/p7/p7.ino` | Arduino | ESP8266 listens for UDP, controls LED |
| `p7/udp_client.py` | Terminal | Type "on"/"off" to control LED |
| `p7/camera_test.py` | Camera | Test webcam with OpenCV |
| `p7/finger_udp.py` | AI (MediaPipe) | Count fingers → control LED |
| `p7/drowsiness_udp.py` | AI (dlib) | Detect drowsiness → alert LED |
| `udp_sender_node.py` (udp_led_bridge) | ROS2 | Bridge ROS2 topic → UDP → ESP8266 |
| `keyboard_input_node.py` | ROS2 | Read keystrokes (a/b/q), publish commands |
| `udp_sender_node.py` (keyboard_node) | ROS2 | Forward commands via UDP, publish status |
| `status_display_node.py` | ROS2 | Display LED status updates in terminal |
| `YoloExamples/object_detection.py` | AI (YOLO) | Real-time object detection with webcam |
| `YoloExamples/annotate/app.py` | Tool (Flask) | Web-based image annotation (like Roboflow) |
| `YoloExamples/annotate_images.py` | Tool (CLI) | Command-line image annotation (OpenCV) |
| `YoloExamples/train_custom_model.py` | AI (YOLO) | Train/predict/export custom YOLO models |
| `YoloExamples/yolo_training_workflow.ipynb` | Notebook | Interactive Jupyter training guide |
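For intuition about the finger-counting entry above: in MediaPipe's 21-landmark hand model, a raised finger's tip sits above its PIP knuckle in the image. The sketch below is our own simplified illustration, not the actual `finger_udp.py` logic — it skips the thumb and assumes an upright hand (smaller y = higher in the frame); the command mapping follows the 2 = ON, 3 = OFF convention used in this project:

```python
# MediaPipe hand landmark indices: fingertips and the PIP joints below them
# (index, middle, ring, pinky). The thumb is skipped in this simplified version.
FINGER_TIPS = [8, 12, 16, 20]
FINGER_PIPS = [6, 10, 14, 18]

def count_fingers(landmarks: list[tuple[float, float]]) -> int:
    """Count raised fingers from 21 (x, y) landmark pairs.

    A finger counts as 'up' when its tip y is smaller (higher in the image)
    than its PIP joint's y. Assumes an upright hand.
    """
    return sum(
        1 for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]
    )

def fingers_to_command(count: int):
    """Map the gesture to a UDP command: 2 fingers = "on", 3 = "off", else None."""
    return {2: "on", 3: "off"}.get(count)
```

The real script adds hand detection, per-frame smoothing, and the UDP send; this only shows the counting rule.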
Our browser-based annotation tool for labeling images in YOLO format – 100% local, no cloud needed.

```
uv run python YoloExamples/annotate/app.py
# Opens at http://localhost:5000
```

**Home Page** – Select a folder of images or upload new ones:

**Folder Browser** – Navigate to your images, define classes, and start annotating:

**Annotation Page** – Draw bounding boxes, manage classes, track progress, save & export:

Supports multi-class labeling – draw boxes with different classes on the same image.

Each box becomes a line in the YOLO `.txt` label file. See the Annotating & Training docs for full details.
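For reference, a YOLO label line is `class x_center y_center width height`, with all four box values normalized to [0, 1] by the image dimensions. A minimal conversion sketch (our own helper, not part of the tool's code):

```python
def bbox_to_yolo_line(cls: int, x1: float, y1: float, x2: float, y2: float,
                      img_w: int, img_h: int) -> str:
    """Convert a pixel-space box (x1, y1) - (x2, y2) to one YOLO .txt label line:
    class index followed by normalized center x, center y, width, height."""
    x_center = (x1 + x2) / 2 / img_w
    y_center = (y1 + y2) / 2 / img_h
    width = (x2 - x1) / img_w
    height = (y2 - y1) / img_h
    return f"{cls} {x_center:.6f} {y_center:.6f} {width:.6f} {height:.6f}"
```

So a box from (25, 25) to (75, 75) in a 100×100 image, class 0, becomes `0 0.500000 0.500000 0.500000 0.500000` — one such line per box, one `.txt` file per image.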
- WiFi SSID: OnePlusRajath
- ESP8266 Default IP: 10.160.6.231
- UDP Port: 4210
- Both your PC and ESP8266 must be on the same WiFi network
ROS2 Jazzy is required for the ROS2 workspaces (ros2_my_own_ws and ros2_keyboar_led_control).
Python packages managed via pyproject.toml with uv:
- `opencv-python` / `opencv-contrib-python` – Camera and image processing
- `mediapipe` – Google's hand landmark detection
- `dlib` – Face landmark detection (68-point model)
- `scipy` – Distance calculations for EAR (Eye Aspect Ratio)
- `ultralytics` – YOLOv8 object detection
- `flask` – Web-based annotation tool
- `jupyterlab` – Jupyter notebooks for interactive training
- `matplotlib` – Plotting training results
- `requests` – HTTP library
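The Eye Aspect Ratio mentioned above is the drowsiness detector's core signal: it is computed from the six eye landmarks of the 68-point model and drops toward 0 as the eye closes. A standalone sketch of the standard formula (using stdlib `math.dist` here for self-containment; the actual script uses scipy):

```python
from math import dist

def eye_aspect_ratio(eye: list[tuple[float, float]]) -> float:
    """EAR for six eye landmarks p1..p6 (dlib's ordering):

        EAR = (|p2 - p6| + |p3 - p5|) / (2 * |p1 - p4|)

    i.e. the two vertical eyelid distances over twice the horizontal eye width.
    Open eyes typically score roughly 0.25-0.35; closed eyes fall near 0.
    """
    p1, p2, p3, p4, p5, p6 = eye
    vertical = dist(p2, p6) + dist(p3, p5)
    horizontal = dist(p1, p4)
    return vertical / (2.0 * horizontal)
```

The detector fires the LED alert when EAR stays below a threshold for several consecutive frames, which filters out ordinary blinks.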
See the docs/ folder for beginner-friendly explanations:
- Installation & Getting Started – Git, ROS2 Jazzy, Arduino, uv, Docker, Webots, etc.
- What is UDP? – UDP vs TCP, sockets, ports
- What is ROS2? – Nodes, Topics, Messages, Parameters
- Project Overview – How everything connects
- Build & Run – Step-by-step instructions + troubleshooting
- How Each Program Works – Deep dive with flowcharts
- Keyboard LED Control – 3-node ROS2 system guide
- YOLO & Ultralytics – Object detection deep dive, project ideas, embedded + image processing
- Annotating & Training – Public datasets, local annotation, training pipeline, complete walkthroughs
For educational/workshop use.


