ATHARVA EDUCATIONAL TRUST'S ATHARVA COLLEGE OF ENGINEERING
(Approved by AICTE, Recognized by Government of Maharashtra & Affiliated to University of Mumbai; Estd. 1999-2000)
ISO 21001:2018 | ISO 14001:2015 | ISO 9001:2015 | NAAC Accredited A+
DEPARTMENT OF ELECTRONICS & TELECOMMUNICATION ENGINEERING
AUTONOMOUS VEHICLE NAVIGATION IN A SIMULATED SOFTWARE
Project Members: Ashwin Sawant (31), Sannidhi Shetty (37), Vidhi Raut (26), Janmenjay Rashinkar (23)
Project Guide: Prof. Dhanashree Pannase, Electronics & Telecommunication Engineering, Atharva University
INDEX
INTRODUCTION
PROBLEM STATEMENT
BACKGROUND
LITERATURE SURVEY
PROPOSED SOLUTION
FEASIBILITY
DESIGN OF SYSTEM
BLOCK DIAGRAM
FUTURE SCOPE
CONCLUSION
REFERENCES
INTRODUCTION
This project focuses on developing a virtual self-driving car simulator that replicates real-world driving conditions for the safe testing and training of autonomous driving algorithms.
PROBLEM STATEMENT
Real-world testing of autonomous vehicles is risky and expensive.
Physical prototypes and hardware setups increase development costs.
Limited real-world scenarios slow down AI training and testing.
Lack of safe, repeatable environments hinders robust algorithm development.
BACKGROUND
Originated from gaming's efforts to simulate realistic driving.
Early games (e.g., Gran Turismo, NFS) showcased vehicle physics.
Game engines were repurposed for autonomous vehicle research.
Tools like Unity enabled real-world scenario simulation.
Simulators like CARLA and Udacity's simulator are used for safe AI testing.
Now essential for cost-effective, risk-free AV development.
LITERATURE REVIEW
1. "A Novel Integrated Simulation and Testing Platform for Self-Driving Cars With Hardware in the Loop" (IEEE Access, 2020)
   Inference: Brain-inspired simulation enables modular, interpretable learning of self-driving control with less real-world data.
   Shortcomings: The approach lacks real-world validation, making its effectiveness in dynamic, unpredictable driving environments uncertain.
2. "Test Your Self-Driving Algorithm: An Overview of Publicly Available Driving Datasets and Virtual Testing Environments" (IEEE Access, 2019)
   Inference: Reviews public datasets and simulators for autonomous driving, guiding tool selection based on sensors, tasks, and features.
   Shortcomings: Limited to cataloguing existing resources; does not evaluate the performance or quality of the datasets and simulators, making it less actionable for practical implementation.
LITERATURE REVIEW (continued)
3. "Sim-on-Wheels: Physical World in the Loop Simulation for Self-Driving" (IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS, 2022)
   Inference: Proposes a hybrid simulation combining real sensor data with virtual control to improve safety, scalability, and efficiency in self-driving development.
   Shortcomings: May face synchronization issues between real and simulated components; hardware requirements could limit scalability.
PROPOSED SOLUTION
Simulate real-world driving scenarios.
Include a user-friendly GUI for interaction.
CURRENT APPROACH: Use ML models (e.g., CNN) for object detection.
NEW APPROACH: Use ML models (e.g., CNN) for object detection and display real-time status, detected objects, and planned path.
FEASIBILITY
Technical Feasibility: Leverages Unity, Python, and open-source ML tools for efficient simulation and integration.
Economic Feasibility: Uses free tools and datasets, requiring only a mid-range computer.
Time Feasibility: Achievable within 7–9 months through modular, milestone-driven development.
Operational Feasibility: User-friendly and flexible, with potential for future expansion.
DESIGN OF SYSTEM
Environment & Input: Virtual road, traffic, and camera feed simulate real-world scenes.
Perception: CNN detects lanes, vehicles, and pedestrians in real time.
Decision-Making: Logic module decides driving actions based on perception.
Control Module: Executes driving actions in the simulator.
GUI Interface: Displays camera view, object detection, and car status.
Feedback Loop: Simulator updates the scene after each action, enabling learning.
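The perception, decision-making, control, and feedback stages above can be sketched as a minimal loop. This is an illustrative sketch only: the names (Detection, perceive, decide, control) are assumptions, and a simple rule-based stub stands in for both the CNN and the Unity simulator.

```python
# Minimal sketch of the perception -> decision -> control feedback loop.
# All names here are illustrative assumptions, not the project's actual API;
# a rule-based stub replaces the CNN and the Unity environment.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str       # e.g. "lane", "vehicle", "pedestrian"
    distance: float  # metres ahead of the ego vehicle

def perceive(frame):
    """Stand-in for the CNN: returns detections for one camera frame."""
    # A real implementation would run a trained CNN on the image frame.
    return [Detection("vehicle", frame.get("lead_vehicle_distance", 100.0))]

def decide(detections):
    """Rule-based stand-in for the decision-making module."""
    nearest = min((d.distance for d in detections), default=float("inf"))
    if nearest < 10.0:
        return "brake"
    elif nearest < 30.0:
        return "slow"
    return "cruise"

def control(action, state):
    """Apply the chosen action to the simulated car state."""
    speed_change = {"brake": -5.0, "slow": -1.0, "cruise": 1.0}[action]
    state["speed"] = max(0.0, state["speed"] + speed_change)
    return state

# Feedback loop: the simulator supplies a new frame after each action.
state = {"speed": 10.0}
for frame in [{"lead_vehicle_distance": 50.0}, {"lead_vehicle_distance": 8.0}]:
    action = decide(perceive(frame))
    state = control(action, state)
```

In the real system, a GUI would read the same state dictionary each cycle to display camera view, detections, and car status.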
BLOCK DIAGRAM
[Figure: Training the Neural Network]
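Since the block diagram itself is not reproduced here, the "training the neural network" stage it depicts can be illustrated with a minimal gradient-descent loop. Everything in this sketch is an assumption for illustration: a one-parameter linear model stands in for the CNN, and the dataset is synthetic.

```python
# Illustrative training loop for the neural-network stage of the block
# diagram. A linear model (w, b) stands in for the CNN so the example
# stays self-contained; the data below is synthetic, not project data.

import random

random.seed(0)

# Toy dataset: camera-derived feature x -> steering target y = 2x + noise.
data = [(x, 2.0 * x + random.uniform(-0.1, 0.1))
        for x in (i / 10.0 for i in range(20))]

w, b = 0.0, 0.0  # model parameters
lr = 0.05        # learning rate

for epoch in range(200):      # repeated passes over the dataset
    for x, y in data:
        pred = w * x + b      # forward pass
        err = pred - y        # prediction error
        # Gradient step on squared error with respect to w and b.
        w -= lr * err * x
        b -= lr * err

# After training, w should be close to the true slope of 2.0.
```

A real CNN training loop has the same shape: forward pass, loss, gradient step, repeated over the dataset, only with image tensors and many more parameters.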
FUTURE SCOPE
Advanced Traffic Scenarios: Include roundabouts, multi-lane highways, and dynamic road rules.
Sensor Fusion: Integrate simulated LiDAR, radar, and GPS for better environment understanding.
Reinforcement Learning: Train models to make decisions in new, unpredictable scenarios.
Cloud-Based Training: Use cloud platforms for large-scale simulation and faster model training.
VR/AR Integration: Improve realism and interactivity with immersive virtual or augmented reality.
CONCLUSION
Demonstrates a safe, low-cost, and flexible platform for self-driving development.
Combines machine learning with simulation for effective testing and training.
GUI integration allows real-time visualization, interaction, and debugging.
Provides a strong base for future upgrades such as reinforcement learning, real-world data, and complex traffic scenarios.
Brings development a step closer to deploying real-world autonomous driving systems.
REFERENCES
1. A. Mavrogiannis, R. Chandra, and D. Manocha, "B-GAP: Behavior-Rich Simulation and Navigation for Autonomous Driving," 2020.
2. J. Hanhirova et al., "A machine learning environment for evaluating autonomous driving software," 2020.
3. Mandal et al., "Research on Autonomous Vehicle Lane-Keeping and Navigation System Based on Deep Reinforcement Learning: From Simulation to Real-World Application," MDPI Electronics, 2023.