Mobile Robot Manufacturing and Driving Control Algorithm Development

Introduction

In the rapidly evolving field of robotics and artificial intelligence, the development of autonomous mobile robots has gained significant attention. This blog post details an undergraduate capstone project focused on manufacturing a mobile robot and developing its driving control algorithm. The project, conducted from September 2019 to June 2020, aimed to build a functional mobile robot using an Arduino and various sensors, and to implement advanced control algorithms.




The primary objective of this study was to design and construct a mobile robot capable of autonomous navigation and to develop sophisticated driving control algorithms. The project involved not only hardware assembly but also software development in the Arduino (C/C++) and Python programming languages.

Project Timeline and Methodology

The project was executed in several phases over the course of an academic year:

  1. Draft Plans (September 2019): Introduction of the task and initial planning.
  2. First Report & Assembly (October 2019): Compilation of required equipment list and robot assembly.
  3. First Semester Final Report (December 2019): Identification of shortcomings and initial coding for various experiments.
  4. Second Semester First Report (May 2020): Advanced planning and listing of additional equipment.
  5. Final Report (June 2020): Presentation of final progress, experiment results, and future planning.

This systematic approach ensured a comprehensive exploration of mobile robotics, from conceptualization to implementation and testing.

Theory and Principles

The project's theoretical foundation rests on several key components and principles:

Bluetooth Function Handler

A graphical user interface was developed using the Python Kivy platform. This interface allowed for remote command input to the robot through both Windows and Android systems. The implementation of Bluetooth functionality enhanced the robot's versatility, enabling wireless control and data transmission[1].
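The report does not include the handler code itself, but the robot-side half of such a link is straightforward. The sketch below is a minimal illustration, assuming an HC-05-style serial Bluetooth module wired to pins 2 and 3 and a single-character command protocol; both the wiring and the protocol are assumptions, not details from the project.

```cpp
#include <SoftwareSerial.h>

// Assumed wiring for an HC-05-style module: module TX -> pin 2, RX -> pin 3.
SoftwareSerial bluetooth(2, 3);  // RX, TX

// Placeholder actions; a real sketch would drive the motor shield here
// (see the Motor Control section below).
void driveForward()  { Serial.println("forward");  }
void driveBackward() { Serial.println("backward"); }
void turnLeft()      { Serial.println("left");     }
void turnRight()     { Serial.println("right");    }
void stopMotors()    { Serial.println("stop");     }

void setup() {
  Serial.begin(9600);     // USB serial for debugging
  bluetooth.begin(9600);  // HC-05 default baud rate
}

void loop() {
  if (bluetooth.available()) {
    // One-byte command sent by the GUI; the letter codes are assumptions.
    switch (bluetooth.read()) {
      case 'F': driveForward();  break;
      case 'B': driveBackward(); break;
      case 'L': turnLeft();      break;
      case 'R': turnRight();     break;
      case 'S': stopMotors();    break;
    }
  }
}
```

The same byte-per-command scheme works whether the sender is the Kivy desktop GUI or an Android device, which is what makes the Bluetooth handler portable across both.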

Motor Control

One challenge was the Arduino's inability to directly control the speed and direction of two motors at once. To overcome this, a motor shield was employed. The shield received control signals from the Arduino in both digital and analog form: digital signals set the direction of each motor, while analog PWM (Pulse Width Modulation) signals regulated its speed. This arrangement allowed precise control over the robot's movement.
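The report does not reproduce the motor code, so the following is a minimal sketch of the digital-direction/PWM-speed scheme, assuming an L298N-style driver with one direction pin and one PWM (enable) pin per motor. All pin numbers are placeholders.

```cpp
// Assumed pin mapping; the actual shield and wiring are not given in the report.
const int LEFT_DIR  = 4;  // digital: HIGH = forward, LOW = reverse
const int LEFT_PWM  = 5;  // PWM: 0-255 duty cycle sets speed
const int RIGHT_DIR = 7;
const int RIGHT_PWM = 6;

void setup() {
  pinMode(LEFT_DIR, OUTPUT);
  pinMode(RIGHT_DIR, OUTPUT);
}

// Signed speed: positive drives forward, negative reverses.
void setMotor(int dirPin, int pwmPin, int speed) {
  digitalWrite(dirPin, speed >= 0 ? HIGH : LOW);        // direction via digital pin
  analogWrite(pwmPin, constrain(abs(speed), 0, 255));   // speed via PWM duty cycle
}

void loop() {
  setMotor(LEFT_DIR,  LEFT_PWM,  200);  // both motors forward at ~78% duty
  setMotor(RIGHT_DIR, RIGHT_PWM, 200);
}
```

Driving the two wheels at different signed speeds is what produces turning, which the line-tracing logic below exploits.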

Logic Gates

Various logic gates were implemented to enhance the robot's functionality. In practice this meant treating each binary sensor reading as a Boolean input and combining those inputs with gate operations, allowing the robot to respond appropriately to different sensor inputs and environmental conditions.
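The report does not list the specific gates used; the snippet below only illustrates the general idea, combining two IR inputs with AND, NOT, and XOR. The pin numbers and the convention that LOW means "line detected" are assumptions.

```cpp
const int LEFT_IR_PIN  = 8;  // hypothetical wiring
const int RIGHT_IR_PIN = 9;

void setup() {
  pinMode(LEFT_IR_PIN, INPUT);
  pinMode(RIGHT_IR_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  // Treat each sensor as one Boolean input (LOW assumed to mean "line seen").
  bool leftOnLine  = (digitalRead(LEFT_IR_PIN)  == LOW);
  bool rightOnLine = (digitalRead(RIGHT_IR_PIN) == LOW);

  // Gate-style combinations drive the decision:
  bool goStraight = leftOnLine && rightOnLine;    // AND
  bool offTrack   = !leftOnLine && !rightOnLine;  // NOT + AND
  bool correct    = leftOnLine ^ rightOnLine;     // XOR: exactly one sensor on the line

  if (goStraight)    Serial.println("straight");
  else if (correct)  Serial.println("correct course");
  else if (offTrack) Serial.println("search for line");
}
```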

Sensors

The project primarily utilized IR (Infrared) and Ultrasonic sensors. These sensors were integral to the robot's ability to perceive its environment. Simple algorithms were developed to interpret the sensor data, enabling the robot to make informed decisions about its movements and actions.
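As an example of such interpretation, an HC-SR04-style ultrasonic sensor is read by firing a short trigger pulse and timing the echo. The sketch below shows the standard pattern; the pin assignments are assumptions, since the report does not specify them.

```cpp
// Assumed HC-SR04 wiring; the report does not give exact pins.
const int TRIG_PIN = 10;
const int ECHO_PIN = 11;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

// Fire a 10 us trigger pulse and convert the echo time to centimeters.
long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // timeout: ~5 m round trip
  return duration / 58;  // sound travels ~58 us per cm of round trip
}

void loop() {
  Serial.println(readDistanceCm());
  delay(100);
}
```

The IR sensors, by contrast, need only a single digitalRead() each, as in the logic-gate snippet above.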

Assembly Process

The assembly of the mobile robot was a critical phase of the project. It involved integrating various components including:

  • Arduino microcontroller
  • Motor shield
  • DC motors
  • IR sensors
  • Ultrasonic sensor
  • Bluetooth module
  • Power supply
[Images: the assembled DIY line-following robot]

The assembly process required careful consideration of component placement to ensure optimal functionality and balance. Proper wiring and connections were essential to prevent short circuits and ensure smooth communication between different parts of the robot.

Experiments and Coding

Several experiments were conducted to develop and refine the robot's artificial intelligence capabilities. These experiments were crucial in testing the effectiveness of the implemented algorithms and identifying areas for improvement.


Line Tracing Algorithm

One of the key experiments involved developing a line tracing algorithm. This algorithm utilized IR sensors to detect and follow a line on the ground. The logic chart for line tracing illustrates the decision-making process based on inputs from left, center, and right IR sensors:


LEFT IR | CENTER IR | RIGHT IR | LEFT MOTOR   | RIGHT MOTOR
   X    |     0     |    X     | START (FULL) | START (FULL)
   0    |     0     |    X     | START (HALF) | START (FULL)
   X    |     0     |    0     | START (FULL) | START (HALF)
   0    |     X     |    X     | STOP         | START (FULL)
   X    |     X     |    0     | START (FULL) | STOP
   0    |     0     |    0     | STOP         | STOP

This chart demonstrates how the robot adjusts its motor speeds based on the detected line position (a 0 marks a sensor that sees the line, an X one that does not), enabling smooth and accurate line following.
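A minimal Arduino implementation of this chart could look as follows, assuming a LOW digital reading corresponds to a 0 (line detected) in the chart and reusing the placeholder pin conventions from the earlier sketches.

```cpp
// Line-tracing logic implementing the chart above. Pin numbers and the
// LOW-means-line-detected convention are assumptions; adjust to the wiring.
const int LEFT_IR = 8, CENTER_IR = 9, RIGHT_IR = 12;
const int LEFT_DIR = 4, LEFT_PWM = 5, RIGHT_DIR = 7, RIGHT_PWM = 6;
const int FULL = 255, HALF = 128;

void setup() {
  pinMode(LEFT_IR, INPUT);
  pinMode(CENTER_IR, INPUT);
  pinMode(RIGHT_IR, INPUT);
  pinMode(LEFT_DIR, OUTPUT);
  pinMode(RIGHT_DIR, OUTPUT);
}

void drive(int leftSpeed, int rightSpeed) {
  digitalWrite(LEFT_DIR, HIGH);   // forward only in this sketch
  digitalWrite(RIGHT_DIR, HIGH);
  analogWrite(LEFT_PWM, leftSpeed);
  analogWrite(RIGHT_PWM, rightSpeed);
}

void loop() {
  bool l = (digitalRead(LEFT_IR)   == LOW);  // true = sensor over the line
  bool c = (digitalRead(CENTER_IR) == LOW);
  bool r = (digitalRead(RIGHT_IR)  == LOW);

  if (l && c && r) drive(0, 0);          // row 6: all on line, stop
  else if (l && c) drive(HALF, FULL);    // row 2: line slightly left, ease left
  else if (c && r) drive(FULL, HALF);    // row 3: line slightly right, ease right
  else if (l)      drive(0, FULL);       // row 4: line far left, sharp left
  else if (r)      drive(FULL, 0);       // row 5: line far right, sharp right
  else if (c)      drive(FULL, FULL);    // row 1: centered, straight ahead
  // No sensor on the line is not covered by the chart; keep the last command.
}
```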

Face Detection with TensorFlow

Another significant experiment involved implementing face detection capabilities using TensorFlow. The process flow for this feature was as follows:

  1. Capture dynamic image
  2. Convert color image to grayscale
  3. Transform image into a matrix object
  4. Send matrix object to TensorFlow algorithm
  5. Check for face and eyes
  6. If detected, create a rectangle around it and display text
  7. Check distance with Ultrasonic sensor
  8. Move accordingly

This implementation showcased the integration of computer vision and machine learning techniques in mobile robotics.
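The detection code itself is not included in the report. As a rough sketch of steps 1 through 6, the following OpenCV (C++) program substitutes OpenCV's bundled Haar cascades for the project's TensorFlow model, which is not available; steps 7 and 8 run on the Arduino using the ultrasonic reading shown earlier.

```cpp
// Sketch of steps 1-6 only, using Haar cascades in place of the project's
// TensorFlow model. The cascade XML files ship with OpenCV.
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
  cv::VideoCapture cam(0);                           // 1. capture dynamic images
  cv::CascadeClassifier faceDet("haarcascade_frontalface_default.xml");
  cv::CascadeClassifier eyeDet("haarcascade_eye.xml");

  cv::Mat frame, gray;                               // 3. frames are matrix objects
  while (cam.read(frame)) {
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);   // 2. color -> grayscale

    std::vector<cv::Rect> faces;
    faceDet.detectMultiScale(gray, faces);           // 4. run the detector
    for (const cv::Rect& f : faces) {
      std::vector<cv::Rect> eyes;
      eyeDet.detectMultiScale(gray(f), eyes);        // 5. confirm eyes in the face
      if (eyes.empty()) continue;
      cv::rectangle(frame, f, {0, 255, 0}, 2);       // 6. box the face...
      cv::putText(frame, "Face", f.tl(), cv::FONT_HERSHEY_SIMPLEX,
                  0.8, {0, 255, 0}, 2);              // ...and display text
    }
    cv::imshow("detector", frame);
    if (cv::waitKey(1) == 27) break;                 // Esc to quit
  }
  return 0;
}
```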

Expected Results and Achievements



The project aimed to achieve several key functionalities:

  1. Autonomous Control: Implementation of self-directed navigation logic for automatic control.
  2. Object Following: Ability to follow objects with speed adjustment using proportional control (see the sketch after this list).
  3. Line Tracing: Successful line following using IR sensors.
  4. Remote Control: Manual control via remote, with potential for voice command integration.
  5. Obstacle Avoidance: Implementation of obstacle avoiding logic.
  6. Human Face Detection: Integration of a camera module for face detection and following using TensorFlow and various sensors.

These functionalities demonstrate the robot's versatility and the successful integration of multiple technologies and algorithms.
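For the object-following item, a proportional controller scales the motor speed with the distance error. The sketch below is illustrative only: it reuses readDistanceCm() and setMotor() from the earlier examples, and the gain and target distance are arbitrary values, not tuned figures from the report.

```cpp
// Minimal proportional (P) controller for object following.
const float KP        = 8.0;   // proportional gain (would be tuned by experiment)
const float TARGET_CM = 30.0;  // desired following distance

int followSpeed(float distanceCm) {
  float error = distanceCm - TARGET_CM;  // positive: object too far, speed up
  int speed = (int)(KP * error);         // output proportional to the error
  return constrain(speed, -255, 255);    // clamp to the signed PWM range
}

void loop() {
  int speed = followSpeed(readDistanceCm());  // from the ultrasonic sketch above
  setMotor(LEFT_DIR,  LEFT_PWM,  speed);      // from the motor control sketch
  setMotor(RIGHT_DIR, RIGHT_PWM, speed);      // negative speed backs away
  delay(50);
}
```

Because the speed shrinks as the error shrinks, the robot slows smoothly as it approaches the target distance instead of oscillating between full speed and a hard stop.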

Limitations and Challenges

While the project achieved significant milestones, it also faced several challenges:

  1. Power Management: Balancing power consumption with performance was an ongoing challenge.
  2. Sensor Accuracy: Improving the accuracy and reliability of sensor readings, especially in varying environmental conditions.
  3. Processing Power: The Arduino's limited processing power occasionally constrained the complexity of implementable algorithms.
  4. Real-time Response: Ensuring quick response times for obstacle avoidance and face tracking in dynamic environments.

Addressing these challenges provided valuable learning experiences and insights into the complexities of mobile robotics.

Future Plans and Enhancements

The project team identified several areas for future development:

  1. GPS Integration: Adding a GPS module for real-time tracking capabilities.
  2. Wi-Fi Connectivity: Implementing a Wi-Fi module to allow control of the robot over the internet.
  3. Advanced AI Algorithms: Exploring more sophisticated machine learning algorithms for improved decision-making.
  4. Enhanced Sensor Array: Incorporating additional sensors for more accurate environmental perception.
  5. Battery Optimization: Researching and implementing more efficient power management systems.

These enhancements would significantly expand the robot's capabilities and potential applications.

Conclusion

This undergraduate capstone project in mobile robotics and artificial intelligence successfully demonstrated the integration of hardware and software to create a functional autonomous robot. The project not only achieved its primary objectives of manufacturing a mobile robot and developing driving control algorithms but also explored advanced concepts such as computer vision and machine learning.

The experiences gained from this project provide a solid foundation for further research and development in the field of robotics. The challenges faced and overcome throughout the project timeline offered valuable insights into real-world engineering problems and their solutions.

As technology continues to advance, projects like this play a crucial role in preparing the next generation of engineers and researchers for the complexities of autonomous systems and artificial intelligence. The future enhancements proposed show the potential for continued innovation and improvement in mobile robotics.

This project stands as a testament to the power of hands-on learning and the importance of interdisciplinary approaches in engineering education. It not only contributed to the academic growth of the team members but also pushed the boundaries of what can be achieved in an undergraduate setting.
