Section A: Project Information
This project aims to revolutionize marine STEM education in Hong Kong by providing affordable, modular underwater robotics kits that integrate artificial intelligence (AI) and immersive simulation. The initiative addresses the critical gap in accessible tools for hands-on marine science education, particularly in under-resourced schools, where high costs and rigid designs of commercial underwater robots limit student engagement and creativity.
Inspired by the MIT Sea Perch program, the project combines modular robotics with AI to create an interactive, student-driven learning platform. Key innovations include real-time data processing through AI algorithms, enabling students to analyze live sensor data and adapt their designs to dynamic environmental conditions. This approach not only enhances theoretical understanding but also fosters practical problem-solving skills, aligning with Bloom’s Taxonomy.
The functional architecture consists of a PS4 controller interfacing with a laptop running a Flask server, while an underwater setup includes a Raspberry Pi 5 for video streaming and an Arduino UNO for controlling thrusters and sensors. The integration of OpenCV for image recognition allows for real-time detection of marine species and pollution, while a chatbot powered by LM Studio provides instant support during missions.
The potential impact of this project is significant: it democratizes access to advanced marine robotics, promotes environmental stewardship through SDG 14 advocacy, and nurtures critical thinking and innovation among students. By linking classroom learning to real-world challenges, the platform empowers students to become active participants in marine conservation efforts. Pilot testing in Hong Kong classrooms will validate learning outcomes, with iterative refinements ensuring adaptability to diverse educational contexts, ultimately fostering a generation of environmentally conscious leaders.
Section B: Participant Information
Title | First Name | Last Name | Organisation/Institution | Faculty/Department/Unit | Email | Phone Number | Current Study Programme | Current Year of Study | Contact Person / Team Leader
---|---|---|---|---|---|---|---|---|---
Mr. | TSZ HIM | KWOK | The Education University of Hong Kong | Faculty of Liberal Arts and Social Sciences | s1148339@s.eduhk.hk | 63878200 | Bachelor's Programme | Year 4 | 
Mr. | Hamza | Luqman | The Education University of Hong Kong | Faculty of Liberal Arts and Social Sciences | s1148389@s.eduhk.hk | 66070844 | Bachelor's Programme | Year 4 | 
Mr. | CHUN YU | WONG | The Education University of Hong Kong | Faculty of Humanities | s1143966@s.eduhk.hk | 67632569 | Bachelor's Programme | Year 3 | 
Mr. | Warren Ka Hin | Cheung | The Education University of Hong Kong | Faculty of Liberal Arts and Social Sciences | s1148326@s.eduhk.hk | 53995665 | Bachelor's Programme | Year 4 | 
Section C: Project Details
Inspiration
Hong Kong’s STEM curricula lack accessible tools for immersive marine science education. While marine ecosystems are critical to sustainability (SDG 14), hands-on learning remains limited due to the high cost of commercial underwater robots (e.g., QYSEA) and their rigid, creativity-stifling designs. These barriers prevent students from experimenting with real-world problem-solving or customizing tools for scientific inquiry.
The MIT Sea Perch program, which pioneered low-cost ROV kits, inspired this project’s core vision: democratizing marine robotics education. However, Sea Perch’s lack of AI integration and real-time analytics limits its ability to connect classroom theory to dynamic environmental challenges. For instance, students cannot analyze live data streams or adapt their designs to evolving conditions, missing opportunities to apply STEM concepts in practical contexts. This gap drove the project’s focus on merging modular robotics with AI to create an interactive, student-driven learning platform.
Hypothesis and Success Factors
The hypothesis asserts that integrating AI with modular ROV design will bridge theoretical and practical learning, advancing students through Bloom’s Taxonomy—from memorizing concepts to analyzing data, designing solutions, and evaluating outcomes. For example, AI algorithms could process real-time sensor data, prompting students to troubleshoot hardware, interpret ecological trends, and propose conservation strategies.
Three factors ensure success:
Low-Cost Materials: Affordable components enable scalability across schools, particularly in resource-constrained settings.
Pedagogical Alignment: Structured around Bloom’s Taxonomy, the platform scaffolds learning from basic robotics principles to advanced AI-driven problem-solving.
SDG 14 Relevance: Linking marine robotics to ocean conservation fosters student engagement with global sustainability challenges, enhancing motivation and real-world impact.
By combining affordability, pedagogical rigor, and environmental relevance, this project addresses systemic gaps in STEM education while nurturing critical thinking and innovation. Pilot testing in Hong Kong classrooms will validate learning outcomes, with iterative refinements ensuring adaptability to diverse educational contexts.
Functional Architecture:
Onshore: PS4 controller → Laptop (Flask server) → RJ45 tether to the underwater unit.
Underwater: Raspberry Pi 5 (video streaming) → Arduino UNO → ESCs → Thruster motors.
AI Image Recognition: OpenCV processes live video feeds for species/pollution detection.
Chatbot Integration: LM Studio API answers student queries during missions.
Real-time Sensor Feedback: Sensors → Arduino UNO → Raspberry Pi 5 → Laptop.
Video-to-3D Scene Conversion for Simulation: Recorded video → 3D reconstruction → simulation environment.
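To illustrate the onshore control path, the following is a minimal sketch of how a normalized PS4 analog-stick reading might be mapped to an ESC pulse width before being relayed to the Arduino. The pulse-width range, neutral point, and deadzone values are assumptions for illustration, not the project's actual calibration.

```python
# Hypothetical sketch: map a PS4 stick axis (-1.0..1.0) to an ESC pulse
# width in microseconds. All constants here are illustrative assumptions.

NEUTRAL_US = 1500   # typical ESC neutral pulse width
RANGE_US = 400      # assumed full-scale deflection from neutral
DEADZONE = 0.05     # ignore small stick drift around center

def axis_to_pwm(axis: float) -> int:
    """Convert a normalized joystick axis value to an ESC pulse width."""
    axis = max(-1.0, min(1.0, axis))   # clamp to the valid axis range
    if abs(axis) < DEADZONE:           # suppress drift near the center
        return NEUTRAL_US
    return int(NEUTRAL_US + axis * RANGE_US)

print(axis_to_pwm(0.0))    # 1500 (neutral)
print(axis_to_pwm(1.0))    # 1900 (full forward)
print(axis_to_pwm(-1.0))   # 1100 (full reverse)
```

In the actual pipeline, the resulting pulse width would be sent by the Flask server over the RJ45 tether and written to the ESCs by the Arduino UNO.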
Function Point | Technical Application | Progress
---|---|---
ROV Control | PS4 controller + Raspberry Pi/Arduino | Fully implemented
Image Recognition | OpenCV + LM Studio | Fully implemented
Chatbot Assistance | LM Studio API | Fully implemented
Real-time Sensor Feedback | Sensor data acquisition | Learning in progress
Video-to-3D Scene Conversion for Simulation | Computer vision + simulation | Learning in progress
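As a sketch of the chatbot assistance path, LM Studio exposes an OpenAI-compatible local HTTP API, so a mission query can be built as a standard chat-completion request. The endpoint URL, port, and model name below are assumptions for illustration and may differ per installation; the network call itself is omitted since it requires a running LM Studio server.

```python
import json

# Sketch of a chatbot query to LM Studio's OpenAI-compatible local server.
# The URL and model name are illustrative assumptions.

LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"  # assumed default port

def build_chat_request(question: str) -> str:
    """Build the JSON body for a single-turn chat completion request."""
    payload = {
        "model": "local-model",  # placeholder; LM Studio serves the loaded model
        "messages": [
            {"role": "system", "content": "You are a marine-science mission assistant."},
            {"role": "user", "content": question},
        ],
        "temperature": 0.7,
    }
    return json.dumps(payload)

body = build_chat_request("What species is in frame?")
print(json.loads(body)["messages"][1]["content"])
# The request would then be POSTed to LMSTUDIO_URL with urllib.request
# or the `requests` library.
```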
Development Timeline
Real-time Sensor Feedback:
Weeks 1–2: Research and procure sensors. Deliverable: functional sensor suite integrated with the ROV.
Weeks 3–4: Hardware integration (sensor wiring to Arduino/Raspberry Pi); software calibration and data validation (Python scripts). Deliverable: calibrated sensor data displayed on the laptop GUI.
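The calibration and validation step above could begin with a small parser on the Raspberry Pi or laptop side. The following sketch assumes the Arduino prints one comma-separated line per reading over serial; the line format and calibration offsets are invented for illustration, not the project's actual protocol.

```python
# Hypothetical sketch of the sensor-feedback path: the Arduino UNO emits
# lines like "depth_m:1.85,temp_c:21.4" over serial, and the receiving
# side parses and calibrates them. Offsets below are assumptions.

CAL_OFFSETS = {"depth_m": -0.12, "temp_c": 0.5}  # assumed calibration constants

def parse_sensor_line(line: str) -> dict:
    """Parse one serial line into a dict of calibrated float readings."""
    reading = {}
    for field in line.strip().split(","):
        key, _, value = field.partition(":")
        reading[key] = float(value) + CAL_OFFSETS.get(key, 0.0)
    return reading

print(parse_sensor_line("depth_m:1.85,temp_c:21.4"))
```

In practice the raw lines would arrive via a serial library such as pySerial before being displayed on the laptop GUI.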
3D Simulation Development:
Weeks 1–2: Develop the video-to-3D conversion algorithm (OpenCV/SLAM). Deliverable: Python script for 3D reconstruction.
Weeks 3–4: Integrate 3D meshes into Blender for texture refinement. Deliverable: sample 3D simulation of an underwater mission.
Performance Metrics
1. AI Detection Latency: time for the AI to analyze a video frame and identify marine objects.
2. Thruster Control Response Time: delay from controller input to thruster movement.
3. Sensor Response Time: speed of sensor data transmission (depth, temperature) to the interface.
All three metrics are critical for real-time interaction, precise control, and reliable data during missions.
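Each of these metrics could be captured by wrapping the stage under test with a high-resolution timer. The sketch below uses a dummy stand-in for the detection stage; the real measurement would wrap the OpenCV pipeline, thruster command path, or sensor read in the same way.

```python
import time

# Minimal sketch of latency measurement: time one invocation of a stage.
# `detect_objects` is an invented placeholder, not the project's detector.

def detect_objects(frame):
    """Placeholder for the real AI detection stage (dummy work only)."""
    return [p for p in frame if p > 128]

def measure_latency_ms(stage, payload) -> float:
    """Return the wall-clock time of one stage invocation in milliseconds."""
    start = time.perf_counter()
    stage(payload)
    return (time.perf_counter() - start) * 1000.0

frame = list(range(256))  # dummy stand-in for a video frame
print(f"AI detection latency: {measure_latency_ms(detect_objects, frame):.3f} ms")
```

Averaging many such samples, rather than a single call, would give a more stable figure for reporting.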
This project redefines marine STEM education by merging modular robotics, AI-driven interactivity, and immersive simulation into a cohesive, student-centered platform. Traditional marine robotics tools, such as commercial ROVs or basic educational kits, often lock users into rigid frameworks that prioritize passive operation over creative problem-solving. In contrast, this initiative introduces three groundbreaking innovations to address these limitations:
AI-Enhanced Marine Science: Unlike static ROV kits, the integration of OpenCV enables real-time detection of marine species and pollution. Coupled with the LM Studio chatbot, which acts as a virtual mentor during missions, students receive instant guidance.
Modular, Student-Centric Design: Commercial ROVs restrict customization, but this project’s 3D-printed modular design empowers students to reconfigure thrusters, sensors, or payloads. This flexibility fosters creativity and iterative engineering—critical for nurturing adaptive problem-solving skills. For example, learners might redesign thruster placements to improve maneuverability in turbulent waters, applying physics principles to real-world challenges.
3D Simulation for Immersive Learning: Post-mission video is converted into interactive 3D scenes, allowing students to spatially analyze missions and refine designs. This novel approach merges robotics with environmental science, enabling learners to visualize and address complex marine ecosystems.
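The pollution-detection idea behind the AI-enhanced missions can be illustrated with a toy colour-threshold check. The real system uses OpenCV on live video; here plain Python tuples stand in for pixel arrays, and all threshold values are invented for the example.

```python
# Illustrative sketch only: flag a frame when enough pixels look like
# bright floating plastic. The (r, g, b) thresholds and the 5% trigger
# fraction are assumptions, not the project's trained detector.

def debris_alert(pixels, min_fraction=0.05):
    """Return True if the fraction of bright pixels exceeds the trigger."""
    bright = sum(1 for (r, g, b) in pixels if r > 200 and g > 200 and b > 200)
    return bright / len(pixels) >= min_fraction

clean_frame = [(30, 80, 120)] * 100                        # murky water only
litter_frame = clean_frame[:90] + [(250, 250, 245)] * 10   # 10% bright pixels
print(debris_alert(clean_frame))   # False
print(debris_alert(litter_frame))  # True
```

In the deployed pipeline, an alert like this could trigger the LM Studio chatbot to prompt a class discussion, linking detection back to the SDG 14 advocacy goal described above.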
The project’s sustainability-driven engagement further amplifies its impact: detecting plastic waste in live video feeds sparks student-led discussions on local ecosystems, transforming learners into advocates for SDG 14.
Scalability Strategies
To ensure scalability, the project employs a modular design that allows seamless replication across schools. Bottlenecks such as hardware costs are mitigated through partnerships with mainland Chinese and local manufacturers for bulk procurement of 3D-printed parts and sensors, reducing unit costs. Training educators via train-the-trainer workshops and sharing curriculum templates on GitHub further accelerate adoption.
Sustainability & Long-Term Engagement
Environmental sustainability is embedded through energy-efficient components (low-power thrusters, rechargeable batteries) and recyclable 3D-printed materials (PLA bioplastic). The platform fosters engagement by linking AI-driven missions (e.g., pollution tracking) to SDG 14 advocacy, enabling students to export data to local conservation NGOs. To adapt to evolving needs, user data will be collected to improve species detection accuracy, while modular sensor ports allow hardware upgrades. Annual curriculum co-creation with educators ensures alignment with emerging STEM trends.
By prioritizing affordability, interoperability, and community-driven innovation, the solution scales impact while nurturing lifelong environmental stewardship.
Social Impact and Responsibility
This solution addresses systemic inequities in STEM education by providing affordable, hands-on marine robotics tools to Hong Kong schools, particularly those in under-resourced communities. By offering modular ROV kits at lower cost than commercial alternatives, the project democratizes access to advanced technologies, directly advancing equity and inclusion.
The platform enhances beneficiaries’ lives by bridging classroom learning to real-world environmental action. Students engage in missions detecting plastic pollution, fostering a sense of agency in addressing SDG 14 (Life Below Water). The 3D simulation feature, which reconstructs missions into interactive environments, supports neurodiverse learners through visual-spatial engagement, ensuring no student is excluded from immersive learning.
Metrics for Social Impact
Quantitative:
Adoption rate: number of schools/students using the platform.
Cost accessibility: Percentage reduction in expenses compared to commercial ROVs.
Qualitative:
Student surveys tracking confidence in STEM and environmental advocacy.
Case studies highlighting career interest in marine science.
Responsiveness to Community Needs
Annual “Innovation Challenges” will invite students to propose hardware/software upgrades, embedding their ideas into future iterations. Partnerships with vocational institutes will ensure marginalized youth gain technical skills for green jobs, linking education to economic mobility.
By centering inclusivity, fostering civic responsibility, and institutionalizing community feedback, the solution empowers students to drive both educational and environmental progress.
Personal Information Collection Statement (PICS):
1. The personal data collected in this form will be used for activity-organizing, record keeping and reporting only. The collected personal data will be purged within 6 years after the event.
2. Please note that it is obligatory to provide the personal data required.
3. Your personal data collected will be kept by the LTTC and will not be transferred to outside parties.
4. You have the right to request access to and correction of information held by us about you. If you wish to access or correct your personal data, please contact our staff at lttc@eduhk.hk.
5. The University’s Privacy Policy Statement can be accessed at https://www.eduhk.hk/en/privacy-policy.
- I have read and agree to the competition rules and privacy policy.