Daniel Zheng

Beginner full-stack developer & aspiring AI engineer

Hand Gesture LED Control System

    July 2025 - Present

  • Relevant Skills: MediaPipe Hands, Arduino & Embedded C, Serial Communication, Gesture Recognition, Python (PySerial)

  • Built a real-time hand gesture recognition system that controls physical hardware from webcam input. Using MediaPipe Hands and OpenCV, I extracted 3D landmark positions for each finger and designed a custom heuristic that estimates finger straightness from joint geometry. Each straightness value is normalized and encoded as an 8-bit signal (0-255), then streamed over a serial connection to an Arduino, which adjusts LED brightness in real time to match finger position; a sketch of the receiving firmware follows the demo below. The project brought together computer vision, signal processing, and hardware integration, and forms the foundation for a gesture-controlled prosthetics interface.

    A short demo of the hand tracking system controlling LED brightness.
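
    Below is a minimal sketch of what the Arduino-side receiver could look like. The 0xFF sync byte, 115200 baud rate, and pin assignments are illustrative assumptions rather than the project's actual protocol (the sender would clamp data bytes to 254 so the sync byte stays unique).

      // Reads five brightness values (one per finger) over serial and
      // writes them to PWM pins. Framing and pin choices are hypothetical.
      const int NUM_FINGERS = 5;
      const int ledPins[NUM_FINGERS] = {3, 5, 6, 9, 10};  // PWM-capable pins on an Uno

      void setup() {
        Serial.begin(115200);
        for (int i = 0; i < NUM_FINGERS; i++) {
          pinMode(ledPins[i], OUTPUT);
        }
      }

      void loop() {
        // Wait until a full frame (sync byte + one byte per finger) has arrived,
        // discarding bytes one at a time until the sync byte is found.
        if (Serial.available() > NUM_FINGERS && Serial.read() == 0xFF) {
          for (int i = 0; i < NUM_FINGERS; i++) {
            int value = Serial.read();                   // 0 = off, 254 = full brightness
            if (value >= 0) analogWrite(ledPins[i], value);
          }
        }
      }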

ECE297 Project (TastiMap)

    Jan. 2025 - Present

  • Relevant Skills: C++, LibCurl, Google Places API, JNI, CMake

  • Built an interactive city map application in C++ featuring pan/zoom, search, and dynamic route highlighting. I implemented efficient data structures and graph algorithms (Dijkstra's algorithm, Travelling Salesman Problem heuristics, etc.) to process large real-world road networks, and integrated the Google Places API via LibCurl to display restaurant data with clickable pop-ups; a sketch of the routing core follows the demo below. I followed agile workflows, used Git for version control, and ensured quality through automated testing and debugging tools. I am currently porting the project to Android with a Kotlin frontend and JNI bridges into the native C++ code via CMake.

    A short demo of the interactive city map application.
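
    Below is a minimal sketch of the kind of Dijkstra's routing core this involves, assuming an adjacency-list graph of intersection IDs with travel times as edge weights; all names here are illustrative rather than the project's actual API.

      #include <vector>
      #include <queue>
      #include <limits>
      #include <utility>

      // adj[u] holds (neighbour, travel_time) pairs for intersection u.
      using Graph = std::vector<std::vector<std::pair<int, double>>>;

      // Returns the shortest travel time from src to every intersection
      // (infinity where unreachable), using a min-heap of tentative distances.
      std::vector<double> dijkstra(const Graph& adj, int src) {
        const double INF = std::numeric_limits<double>::infinity();
        std::vector<double> dist(adj.size(), INF);
        dist[src] = 0.0;

        using Entry = std::pair<double, int>;  // (distance, intersection)
        std::priority_queue<Entry, std::vector<Entry>, std::greater<Entry>> pq;
        pq.push({0.0, src});

        while (!pq.empty()) {
          auto [d, u] = pq.top();
          pq.pop();
          if (d > dist[u]) continue;  // stale heap entry; skip it
          for (auto [v, w] : adj[u]) {
            if (dist[u] + w < dist[v]) {
              dist[v] = dist[u] + w;
              pq.push({dist[v], v});
            }
          }
        }
        return dist;
      }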

ECE243 Project (Heardle)

    Jan. 2025 - Apr. 2025

  • Relevant Skills: FPGA, C, PS/2 Keyboard, FIFO Buffers, Audio Codec, VGA Framebuffer

  • Built a music-guessing game in C for the DE1-SoC FPGA, streaming 16-bit audio clips through FIFO buffers and the on-board audio codec; a sketch of the streaming loop follows below. Parsed PS/2 keyboard input for guesses and seeded random song selection from the on-board system timer. Drew pixel-based UI screens directly to the VGA framebuffer and implemented double-buffering for smooth visuals.

    Note: no public demo is available because the game streams copyrighted music; this project is strictly a technical showcase.
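
    Below is a minimal sketch of the audio streaming loop, following the standard DE1-SoC Computer memory map (audio port at 0xFF203040); the mono clip buffer and the non-blocking call pattern are illustrative assumptions.

      #include <cstdint>

      // Memory-mapped audio port registers (DE1-SoC Computer layout).
      struct AudioPort {
        volatile uint32_t control;    // 0xFF203040
        volatile uint32_t fifospace;  // write space: bits 31-24 left, 23-16 right
        volatile uint32_t leftdata;   // write a sample to the left-channel FIFO
        volatile uint32_t rightdata;  // write a sample to the right-channel FIFO
      };

      #define AUDIO ((AudioPort*) 0xFF203040)

      // Pushes as many 16-bit samples as the FIFOs will take, then returns so
      // the main game loop can keep polling the keyboard and redrawing the UI.
      int stream_clip(const int16_t* samples, int n, int pos) {
        while (pos < n &&
               ((AUDIO->fifospace >> 24) & 0xFF) > 0 &&  // left FIFO has space
               ((AUDIO->fifospace >> 16) & 0xFF) > 0) {  // right FIFO has space
          AUDIO->leftdata  = samples[pos];  // mono clip: duplicate on both channels
          AUDIO->rightdata = samples[pos];
          pos++;
        }
        return pos;  // caller resumes from this sample on the next iteration
      }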

University of Toronto Robotics Workshop (Wireless-Controlled Robot)

    May 2024 - June 2024

  • Relevant Skills: Arduino, IR Sensors, H-Bridge Design, Fusion 360

  • Engineered a wireless robot from the ground up for a competition. Designed a custom chassis in Fusion 360, built power-regulation circuits (H-bridge, rectifiers), and wrote embedded C firmware (Arduino) for IR-sensor path-tracing and joystick control; a sketch of the line-following loop follows the demos below. Won first place in the wireless control challenge.

    A short demo of the robot's IR path-tracing capabilities.

    A short demo of the robot's Bluetooth control capabilities.
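
    Below is a hypothetical sketch of the IR path-tracing loop: two reflectance sensors straddle the line, and the H-bridge steers toward whichever side drifts off it. Pin numbers, the threshold, and the two-sensor layout are illustrative assumptions.

      const int IR_LEFT  = A0;        // left reflectance sensor
      const int IR_RIGHT = A1;        // right reflectance sensor
      const int THRESHOLD = 500;      // above this reading = sensor sees the line

      // H-bridge inputs: one direction pin plus one PWM enable per motor.
      const int LEFT_EN = 5,  LEFT_DIR  = 4;
      const int RIGHT_EN = 6, RIGHT_DIR = 7;

      void drive(int leftSpeed, int rightSpeed) {
        digitalWrite(LEFT_DIR, HIGH);   // forward-only in this sketch
        digitalWrite(RIGHT_DIR, HIGH);
        analogWrite(LEFT_EN, leftSpeed);
        analogWrite(RIGHT_EN, rightSpeed);
      }

      void setup() {
        pinMode(LEFT_EN, OUTPUT);  pinMode(LEFT_DIR, OUTPUT);
        pinMode(RIGHT_EN, OUTPUT); pinMode(RIGHT_DIR, OUTPUT);
      }

      void loop() {
        bool leftOnLine  = analogRead(IR_LEFT)  > THRESHOLD;
        bool rightOnLine = analogRead(IR_RIGHT) > THRESHOLD;

        if (leftOnLine && rightOnLine)  drive(180, 180);  // centred: go straight
        else if (leftOnLine)            drive(80, 180);   // line to the left: turn left
        else if (rightOnLine)           drive(180, 80);   // line to the right: turn right
        else                            drive(0, 0);      // lost the line: stop
      }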