Development Of Autonomous Downscaled Model Car Using Neural Networks And Machine Learning

Machine learning using a convolutional neural network.

Required hardware: Raspberry Pi, Pi camera, a compatible RC car, and an L293D motor driver.

Please create the following folders in the project directory: forward, idle, left, right, reverse, optimized_thetas.

This project aims to build an autonomous RC car using supervised learning with a neural network that has a single hidden layer. We have not used any machine learning libraries, since we wanted to implement the neural network from scratch to understand the concepts better. Throughout, the DC motor controlling the left/right direction is referred to as the front motor and the motor controlling the forward/reverse direction as the back motor.

Wiring: Connect the BACK_MOTOR_DATA_ONE and BACK_MOTOR_DATA_TWO GPIO pins (GPIO17 and GPIO27) of the Raspberry Pi to the input pins for Motor 1 (Input 1, Input 2), and the BACK_MOTOR_ENABLE_PIN GPIO pin (GPIO22) to the enable pin for Motor 1 (Enable 1,2) on the L293D motor driver IC. Connect the output pins for Motor 1 (Output 1, Output 2) of the IC to the back motor. Connect the FRONT_MOTOR_DATA_ONE and FRONT_MOTOR_DATA_TWO GPIO pins (GPIO19 and GPIO26) of the Raspberry Pi to the input pins for Motor 2 (Input 3, Input 4) on the IC. Connect the output pins for Motor 2 (Output 3, Output 4) of the IC to the front motor.

Configuration: PWM_FREQUENCY and INITIAL_PWM_DUTY_CYCLE set the initial frequency and duty cycle of the PWM output. We have defined five class labels, namely forward, reverse, left, right and idle, and assigned their expected output values; each class label requires a folder of the same name in the current directory. Input images are resized to the IMAGE_DIMENSION tuple during training. LAMBDA and HIDDEN_LAYER_SIZE are the default regularization value and the number of nodes in the hidden layer used while training the neural network. All of these values are configurable in configuration.py.

Training and running: The training images are captured using interactive_control_train.py; the car is driven with the arrow keys, and every image is saved in the current folder together with the corresponding key press. After segregating the images into their class folders, the neural network is trained with train.py, which takes two optional arguments, lambda and hidden layer size; the defaults are the values specified in the configuration file. Run it from the command prompt (example invocations are shown below). Once the model is trained, the RC car runs autonomously with autonomous.py, which takes an optional argument for the trained model; by default it uses the latest model in the optimized_thetas folder.

Please feel free to post any questions about the code through my LinkedIn: linkedin.com/in/shreyas-ramachandran-srinivasan-565638
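For reference, here is a minimal sketch of what configuration.py might contain, based on the pin names, labels, and settings described above. The pin names and GPIO numbers come from the wiring description; the numeric defaults for the PWM, image, and training settings, and the CLASS_LABELS variable name, are placeholders, since the actual values are not given here.

    # configuration.py -- illustrative sketch only; numeric defaults are assumptions
    # GPIO pin assignments (BCM numbering) from the wiring description above
    BACK_MOTOR_DATA_ONE = 17
    BACK_MOTOR_DATA_TWO = 27
    BACK_MOTOR_ENABLE_PIN = 22
    FRONT_MOTOR_DATA_ONE = 19
    FRONT_MOTOR_DATA_TWO = 26

    # Initial PWM settings (placeholder values)
    PWM_FREQUENCY = 100            # Hz
    INITIAL_PWM_DUTY_CYCLE = 50    # percent

    # The five class labels; each needs a folder of the same name (variable name assumed)
    CLASS_LABELS = ["forward", "reverse", "left", "right", "idle"]

    # Training settings (placeholder values)
    IMAGE_DIMENSION = (32, 32)     # images are resized to this size during training
    LAMBDA = 1.0                   # regularization strength
    HIDDEN_LAYER_SIZE = 25         # nodes in the single hidden layer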
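Because the network is implemented from scratch, the forward pass of a one-hidden-layer network of the kind described above can be sketched as follows. The sigmoid activation, the bias-column weight layout, and the function names are assumptions for illustration; the trained weight matrices would be the ones saved in the optimized_thetas folder.

    import numpy as np

    def sigmoid(z):
        # Logistic activation (assumed choice for this sketch)
        return 1.0 / (1.0 + np.exp(-z))

    def predict(theta1, theta2, x):
        """Forward pass of a single-hidden-layer network.

        theta1: (hidden_layer_size, n_features + 1) weights, bias in column 0 (assumed layout)
        theta2: (n_classes, hidden_layer_size + 1) weights, bias in column 0 (assumed layout)
        x:      flattened, normalized input image of length n_features
        Returns the index of the predicted class (forward/reverse/left/right/idle).
        """
        a1 = np.concatenate(([1.0], x))                       # input layer plus bias unit
        a2 = np.concatenate(([1.0], sigmoid(theta1 @ a1)))    # hidden layer activations plus bias
        a3 = sigmoid(theta2 @ a2)                             # output layer: one score per class
        return int(np.argmax(a3))

The returned index maps onto the five class labels in whatever order the training code assigns them.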
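The exact command-line syntax is not spelled out above, so the following invocations are illustrative only; as described in the text, both train.py arguments and the autonomous.py model argument are optional.

    # Capture training images while driving with the arrow keys
    python interactive_control_train.py

    # Train the network; lambda and hidden layer size are optional
    # (defaults come from configuration.py)
    python train.py [lambda] [hidden_layer_size]

    # Drive autonomously; the model argument is optional
    # (defaults to the latest model in the optimized_thetas folder)
    python autonomous.py [trained_model]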
CONTROLLING THE CAR

The control process consists of four parts:

- The sensor interface layer includes the software modules concerned with acquiring and time-stamping all sensor data.
- The perception layer maps sensor data into internal models. The primary module in this layer is the Pi camera, which determines the vehicle's orientation and location. Two separate modules let the car navigate using the ultrasonic sensor and the camera: a road-finding module uses the Pi camera images to find the boundary of the road so that the vehicle can centre itself laterally, and a surface-assessment module extracts parameters of the current road surface to determine safe vehicle speeds.
- The control layer is responsible for regulating the steering, throttle, and brake response of the vehicle. A key module is the path planner, which sets the trajectory of the vehicle in steering and speed space.
- The vehicle interface layer serves as the interface to the car's drive-by-wire system. It contains all interfaces to the vehicle's brakes, throttle, and steering, and it also includes the interface to the vehicle's power server, a circuit that governs the physical power supplied to many of the system components.

In the proposed system, the Raspberry Pi controls the L293D board, which drives the motors using the pulses the Pi provides. Based on the captured images, the Raspberry Pi outputs PWM pulses to control the L293D. The L293D is a 16-pin motor driver IC, shown in Figure 9, designed to provide bidirectional drive currents at voltages from 5 V to 36 V. It also allows the speed of a motor to be controlled using PWM.

Figure 9: L293D breakout board.

PWM signals: A PWM signal is a series of high and low pulses; the durations of the high and low periods determine the average voltage supplied to the motor and hence its speed. The speed of a DC motor is, in general, directly proportional to the supply voltage, so reducing the voltage from 9 V to 4.5 V halves the speed. However, we cannot keep changing the supply voltage itself, so a PWM speed controller instead varies the average voltage delivered to the motor. The input given to the PWM controller may be an analog or digital signal, depending on the controller's design; the controller accepts this control signal and adjusts the duty cycle of the PWM output according to the requirements. In these waveforms the frequency stays the same, but the ON and OFF times differ.

A rechargeable power bank (here, 2800 mAh with a 5 V DC output) supplies the central microcontroller, which in turn distributes the required power to each hardware component. The battery pack can be recharged and reused repeatedly.
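As a concrete illustration of the PWM control described above, the sketch below uses the RPi.GPIO library with the back-motor pins from the wiring section to drive one channel of the L293D. The drive_forward helper, the frequency, and the duty-cycle values are illustrative and not taken from the original code.

    import RPi.GPIO as GPIO

    # Pin numbers from the wiring section (BCM numbering)
    BACK_MOTOR_DATA_ONE = 17
    BACK_MOTOR_DATA_TWO = 27
    BACK_MOTOR_ENABLE_PIN = 22

    PWM_FREQUENCY = 100          # Hz (placeholder value)
    INITIAL_PWM_DUTY_CYCLE = 50  # percent (placeholder value)

    GPIO.setmode(GPIO.BCM)
    GPIO.setup([BACK_MOTOR_DATA_ONE, BACK_MOTOR_DATA_TWO, BACK_MOTOR_ENABLE_PIN], GPIO.OUT)

    # PWM on the enable pin controls how much of the time the motor is powered,
    # and therefore the average voltage it sees and its speed.
    pwm = GPIO.PWM(BACK_MOTOR_ENABLE_PIN, PWM_FREQUENCY)
    pwm.start(INITIAL_PWM_DUTY_CYCLE)

    def drive_forward(duty_cycle):
        # Illustrative helper: set the H-bridge inputs for forward rotation
        # and adjust the duty cycle to set the speed.
        GPIO.output(BACK_MOTOR_DATA_ONE, GPIO.HIGH)
        GPIO.output(BACK_MOTOR_DATA_TWO, GPIO.LOW)
        pwm.ChangeDutyCycle(duty_cycle)

    drive_forward(50)   # roughly half the average supply voltage
    # ... drive ...
    pwm.stop()
    GPIO.cleanup()

A 50% duty cycle delivers about half the average supply voltage, which corresponds to the 9 V to 4.5 V example above.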