
Joseph Lloyd Ken Schneider


Joseph Lloyd [email protected]
Ken Schneider [email protected]
Scott Marchant [email protected]
Crismar Mejia [email protected]

February 10, 2011

Remotely Operated Vehicle

Dr. Cripps,

Attached is the final report for our Remotely Operated Vehicle project. We are sending you this document so you can review what we were able to do and how we realized the project. Your feedback concerning our project would be appreciated. Feel free to contact us at the emails provided.

Regards,
Joseph Lloyd
Ken Schneider
Scott Marchant
Crismar Mejia

RC Vehicle Imaging

Joseph Lloyd, Scott Marchant, Ken Schneider, Crismar Mejia

May 6, 2011

Abstract

This report covers the summary and details of our project, which implements a remotely operated vehicle with added functionality for operating in low-visibility environments. The user drives the vehicle wirelessly with a joystick and receives video and sonar feedback for navigation purposes.

Contents

1 Introduction
  1.1 Subject and Purpose
  1.2 Problem Statement and Design Objectives
  1.3 Summary of Design Process
  1.4 Summary of Final Results
  1.5 Organization and Summary of Report
2 Review of Conceptual and Preliminary Design
  2.1 Problem Analysis
  2.2 Review of Problem
  2.3 Summary of Specifications
  2.4 Discussion of Main Features of the Design Problem
3 Basic Solution Description
  3.1 Design Overview
4 Design Details
  4.1 Description of Components
  4.2 Design Criteria Used
    4.2.1 PC Software Design Criteria
    4.2.2 ROV Criteria
  4.3 Drawings and Schematics
    4.3.1 Turret Mounted Sonar Sensor
    4.3.2 Bluetooth
    4.3.3 Speed Encoder
  4.4 Construction Details
    4.4.1 Truck Chassis and Body
    4.4.2 8051F020 & Protoboard
    4.4.3 Stepper Motor & Sonar Sensor
    4.4.4 Speed Encoder
    4.4.5 Wireless Camera
    4.4.6 Summary of Final Design Results
5 Implementation and Assessment
  5.1 Details of Implementation
    5.1.1 Graphical User Interface
    5.1.2 Joystick Interface
    5.1.3 Video Interface
    5.1.4 Bluetooth Interface
    5.1.5 Truck Interfacing with MCU
6 Final Scope of Work Statement
  6.1 Summary of Completed Work
  6.2 Summary of Work Remaining
  6.3 Lessons Learned
7 Other Issues
8 Cost Estimation
  8.1 Estimate of System Cost
  8.2 Estimate of Design Cost
9 Project Management Summary
  9.1 Tasks
  9.2 Timeline
  9.3 Personnel
10 Conclusion

List of Tables

1 Stepper Motor Sequence
2 System Cost
3 Design Cost

List of Figures

1 Top Level Diagram
2 Middle Level Diagram
3 Sonar Sensor Schematic
4 Stepper Motor Schematic
5 Bluetooth Schematic
6 Speed Encoder Schematic
7 Laptop-Joystick
8 Video Camera and Receiver
9 Bluetooth Module (mouser.com)
10 Gantt Chart

1 Introduction

This report covers the reasoning behind building a remotely operated vehicle and the process taken to achieve our final results.

1.1 Subject and Purpose

The subject of our project is to lay out a model of an enhanced real-world driving experience through a remote-control (RC) vehicle. The purpose is to demonstrate ways in which the driving experience can be enhanced to improve safety and comfort, particularly at night.

1.2 Problem Statement and Design Objectives

With the increasing deer population throughout the United States, the rate of accidents on U.S. highways caused by deer is also on the rise.
State Farm Insurance Agency reports that, in 2004, there were 150 human deaths in the United States related to auto-deer collisions (Bell). Cadillac was the first car maker to create a solution for this type of problem, with night vision technology introduced on the 2000 DeVille (Cadillac Becomes First to Offer Night Vision Technology). Our project needs to model this real-world problem and its solution. Requirements that our solution needs to meet include:

• Significantly increase worst-case nighttime visibility.
• Do not negatively impact visibility in any conditions.
• Accurately model potential hazards to the driver.

1.3 Summary of Design Process

This design will be implemented through the basic structure of a PC, embedded processor, sensors, and actuators. These elements, working together, will present the user with an overhead view of obstacles that present hazards to the vehicle. Also, an infrared video feed will be used to increase visibility directly in front of the vehicle. Other components, such as proximity sensors, will be installed to ensure vehicle safety and user comfort.

1.4 Summary of Final Results

Section 6 will discuss the final results of our design. Overall, we were happy with the outcome and felt that our implementation of the design was a success.

1.5 Organization and Summary of Report

This report will discuss and explain the various parts of the design from inception to implementation. We will start with Section 2, which discusses the concepts that went into our preliminary design. We will then move into the description of our solution, followed by the details of our design. Section 5 will go over the implementation of the design. The concluding sections will then cover the state of the implementation and project management concerns such as known issues, estimated costs, and timetables.
2 Review of Conceptual and Preliminary Design

2.1 Problem Analysis

2.2 Review of Problem

Driving in the dark is more dangerous than driving during daylight hours. The main goal of our project is to reduce the hazards associated with low-light driving. We have done this by creating two extra visual interfaces for the driver to refer to: a long-range sonar turret and a shorter-range infrared camera. In order to test this system, we also had to create a mobile platform. To serve this purpose, we modified a radio-controlled truck to be controlled by a microcontroller that is in constant contact with a PC via a Bluetooth link.

2.3 Summary of Specifications

The remotely operated vehicle (ROV) has to be able to navigate efficiently through an obstacle course in a dark room. The truck will be controlled from a computer in a different room. The interface needs to be easy enough to use that an average car owner can operate the ROV. There is also a strict $500 budget for the project.

2.4 Discussion of Main Features of the Design Problem

There are four major components in this system: the sonar turret, the truck and all of its parts, the wireless communications, and the PC interface. A brief summary is provided here, and each of these will be discussed in detail later.

The sonar turret consists of a single sonar sensor mounted on the output shaft of a stepper motor. The motor sweeps the sensor through a 180° arc, pausing every few degrees to let the sensor collect a sample. We had originally planned on having the turret provide a full 360° long-range view around the truck, but the limited sample rate of the sonar sensor reduced our possible resolution so much that we decided instead on a higher-resolution 180° field of view in front of the truck.

The truck consists of four major components, with two other components attached to it.
The major truck part (besides the truck itself) is the microcontroller, which controls the Bluetooth module on the truck, the steering servo, the main drive motor controller, the sonar turret, a proximity sensor, and a speedometer. It also ferries data from all the sensors to the PC through the Bluetooth module. The steering servo and motor controller receive PWM signals from the microcontroller and provide the necessary voltage and current to steer and drive the truck. A proximity sensor mounted above the front bumper of the truck creates an emergency override that applies the brakes if it senses an object in front of the vehicle. Mounted on a rear wheel are four small magnets that trigger a Hall effect sensor. This data is sent to the PC, where it is converted into a linear speed and displayed on the GUI as a speedometer. The Bluetooth module mounted on the truck communicates with the module built into an average laptop. This connection carries both control information from the PC to the truck and sensor data from the truck to the PC.

Original plans for wireless communications were to use a WiFi transmitter/receiver so that the truck could be controlled anywhere on campus, but in order to keep costs lower we decided to use the limited-range Bluetooth module instead.

A PC does the bulk of the calculations on data received from sensors on the truck. The PC receives input from a joystick and converts it into control information to send through the Bluetooth link. It also displays the two visual interfaces for the driver, in the form of a radar-like screen from the sonar turret and video from the camera.

3 Basic Solution Description

This section outlines a basic description of the design and its functionality. It is perhaps easiest to present and understand the design in a top-down approach. Section 3.1 outlines the design from the top level down to, but not including, the component selection. Component selection and schematics are covered in detail in Section 4.
3.1 Design Overview

Figure 1: Top Level Diagram

The design can be partitioned into two major sub-components: the PC and the ROV. These functional partitions are shown in Figure 1. The user is able to control the ROV using the PC. Additionally, the user is provided with visual feedback on the PC monitor that allows them to navigate the ROV from a remote location (somewhere they could otherwise not see the vehicle, such as a different room).

The interaction between the user, PC, and ROV is better described in a more detailed block diagram, shown in Figure 2. Here the PC function block has been divided into blocks 3.1.1 - 3.1.5, as shown on the left side of the diagram. On the right side, blocks 3.1.6 - 3.1.15 constitute the ROV functional partitioning. The functionality and interaction of these blocks are described in detail in the following list.

Figure 2: Middle Level Diagram

Block Descriptions

• 3.1.1 - PC: This block represents the personal computer hardware. It receives user input from block 3.1.3 and provides user output to the computer monitor, block 3.1.2.

• 3.1.2 - Display: This block represents the computer monitor. It receives input from the PC (3.1.1) and relays visual information to the user. The user sees this block as the Graphical User Interface (GUI).

• 3.1.3 - Joystick: This block represents the joystick. It is the primary means for the user to control the ROV. The joystick readings are relayed to the PC (3.1.1) via a USB connection.
The PC then passes corresponding control information on to the ROV.

• 3.1.4 - Video Converter: This block represents a video converter that converts the composite video stream from the Camera Receiver (3.1.5) into an NTSC standard video streaming format. This video stream is relayed to the PC via a USB connection.

• 3.1.5 - Camera Receiver: This block represents the wireless video receiver. The receiver receives a wireless video stream from the video camera (3.1.15) and relays a composite video feed to the video converter (3.1.4).

• 3.1.6 - 8051 Microcontroller: This block is the 8051 microcontroller, the brain of the ROV, which interfaces with every ROV component except the video camera. The microcontroller serves as a translator between the computer and the other ROV components. This is described in more detail in the remaining functional block descriptions.

• 3.1.7 - Bluetooth: This is the Bluetooth module. It interfaces wirelessly with the PC's built-in Bluetooth module to relay data between the PC and the microcontroller.

• 3.1.8 - Proximity Sensor: This block represents a proximity sensor that is used to detect whether the front side of the ROV is too close to an object. One embodiment of this block might include an infrared sensor. The block relays this information to the microcontroller.

• 3.1.9 - Sonar Sensor: The sonar sensor is the flagship sensor of the ROV. In and of itself, it is just a typical sonar sensor that measures the distance from itself to an object in front of it. However, 3.1.9 is mounted on a stepper motor (3.1.14) to form a turret. In this manner, the sonar sensor can be rotated to take distance measurements from multiple angles. The sonar sensor relays its distance measurements to the microcontroller via an analog signal that is proportional to the distance.

• 3.1.10 - Speed Encoder: The speed encoder detects magnets that are placed on one of the ROV's wheels.
It relays this information to the microcontroller, which uses the data to determine the speed at which the wheels are rotating.

• 3.1.11 - Speed Controller: The speed controller accepts a Pulse-Width Modulated (PWM) signal from the microcontroller and relays a corresponding signal to the Drive Motor (3.1.12).

• 3.1.12 - Drive Motor: The Drive Motor receives an input voltage from the Speed Controller (3.1.11) and turns the ROV's wheels at a corresponding speed, either forward or backward.

• 3.1.13 - Steering Servo Motor: The Steering Servo Motor accepts a PWM signal from the microcontroller in an almost identical manner to the Speed Controller (3.1.11). The servo then correspondingly moves the ROV steering shaft to turn the front wheels left and right.

• 3.1.14 - Stepper Motor: The microcontroller relays steps as an input to the Stepper Motor. The motor then rotates through each one of these steps. A step can be thought of in this instance as a small increment or decrement in angular position.

• 3.1.15 - IR Camera: The IR camera is, in essence, a video camera. It can be modified to see the infrared spectrum in addition to the visible light spectrum by removing the visible light filter. The camera has built-in wireless communication to relay its images to the wireless receiver (3.1.5) using a custom protocol.

4 Design Details

This section delves into the details of the design described in the previous section. It starts by describing each component specification in 4.1 and continues with our design criteria and our schematics.

4.1 Description of Components

Vehicle & Motor Controller

The radio-controlled truck we chose for this project is a first-generation Traxxas Stampede. It was bought used on eBay for about $80. There are two main reasons why we chose this truck. First, the vehicle has a modular design, so any component can easily be replaced or modified. Second, this truck is very durable.
These trucks are built to withstand large amounts of abuse without breaking. The truck is also electric, making indoor operation possible, and it can easily be driven on rough terrain. The truck runs on a 7.2 volt battery, which is a high enough voltage to power all of the equipment we mounted on the vehicle.

The Stampede originally came with what Traxxas calls an analog speed controller to control the truck's main drive motor. It is basically a servo that turns a very low-resolution potentiometer. This system is extremely inefficient (it requires two huge resistors that produce massive amounts of heat) and did not allow the speed sensitivity needed for safe indoor operation. To fix this, we installed a Traxxas XL-5 electronic speed controller. This product comes standard on the newest generation of Traxxas electric radio-controlled vehicles. The XL-5 has very high resolution, which allows for the speed sensitivity we need.

Microcontroller

The C8051F020 microcontroller is produced by Silicon Labs and is based on Intel's 8051 Harvard architecture. It features a pipelined core capable of attaining up to 25 MIPS, 4352 bytes of on-chip RAM, 64 kB of flash memory, and a non-intrusive debug interface. This microcontroller features many useful peripherals. The ones we are using are:

• True 12-bit 100 ksps 8-channel ADC with PGA and analog multiplexer.
• True 8-bit 500 ksps 8-channel ADC with PGA and analog multiplexer.
• Two 12-bit DACs with programmable update scheduling.
• One of the UART serial interfaces implemented in hardware.
• All five general-purpose 16-bit timers.

Hall Effect Sensor & Magnets

The Melexis US1881 is a latched Hall effect sensor. It integrates a voltage regulator, so it has a wide operating voltage range of 3.5 V to 24 V. It combines a Hall sensor with dynamic offset cancellation, a Schmitt trigger, and an open-drain output capable of sinking up to 50 mA. The magnets are composed of Neodymium/Iron/Boron (NdFeB) and are 0.125" cubes.
They activate the Hall effect sensor at a distance of 1 cm.

PC

To properly execute the GUI, there are a number of setup requirements. The computer must have a Class II Bluetooth module for a range of about 30 feet, or a Class I Bluetooth module for a range of about 200 feet. Additionally, the computer must have Java SE installed, as well as certain runtime libraries. The installation of these libraries is described in detail in the remainder of this section. Finally, the computer must be running the Ubuntu 11.04 32-bit operating system.

Three different runtime libraries must be installed in order for the GUI to properly execute, namely the video, joystick, and serial communication libraries. To install the video libraries, first download libvideo0 i386 and libv4l4j i386 from http://code.google.com/p/v4l4j/downloads/list. Because these libraries are Debian packages, they can easily be installed by opening them with Ubuntu Software Center and clicking the Install button.

The installation of the joystick and serial libraries is somewhat less trivial. The joystick library is distributed in the form of the files jinput.jar and libjinput-linux.so. The serial libraries consist of the RXTXcomm.jar, librxtxParallel.so, and librxtxSerial.so files. The .jar files must be placed in the /usr/lib/jvm/java-6-openjdk/lib directory, and the .so files must be placed in the /usr/lib/jni directory.

Setting up the libraries for use in an Integrated Development Environment (IDE) is dependent on the IDE and is not required to run the GUI executable; therefore, this type of setup will not be covered in this document. However, if the reader is interested, a good tutorial on setting up the video library for use in compiling code can be found at the v4l4j Google Code website, http://code.google.com/p/v4l4j/wiki/IDESetup. The setup for the joystick and serial libraries is similar.
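After copying the files into place, it can be handy to confirm that the JVM actually sees the native libraries. The small diagnostic below is our suggestion rather than part of the original setup; it simply checks each directory on java.library.path for the files named above:

```java
import java.io.File;

/** Diagnostic sketch: report whether a native library file is present in
 *  any directory on java.library.path. The file names checked follow the
 *  files described in the setup instructions above. */
public class NativeLibCheck {
    /** Returns true if fileName exists in any entry of the given path list. */
    public static boolean onPath(String pathList, String fileName) {
        for (String dir : pathList.split(File.pathSeparator)) {
            if (new File(dir, fileName).isFile()) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        String path = System.getProperty("java.library.path");
        System.out.println("libjinput-linux.so found: "
                + onPath(path, "libjinput-linux.so"));
        System.out.println("librxtxSerial.so found: "
                + onPath(path, "librxtxSerial.so"));
    }
}
```

If either line prints false, the corresponding .so file is not where the JVM will look for it.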
4.2 Design Criteria Used

As described in the Design Overview (Section 3.1), our design is broken into two different sub-components, the PC and the ROV. Following this breakdown, each component has different design criteria.

4.2.1 PC Software Design Criteria

Since a PC has much greater processing power than a microcontroller, we approached the design of our PC software with this idea in mind. Hence, our timing constraints are looser, as the computer is able to process data faster. Also, most computations that can be done in either the microcontroller or the PC are performed on the PC for the same reason. That is why the filtering of the video and the radar calculations are done on the PC.

4.2.2 ROV Criteria

Our main concern on the ROV sub-component, as opposed to the PC sub-component, is speed and latency. Our criterion for making design decisions was to attain the maximum speed possible without compromising the ROV design performance.

4.3 Drawings and Schematics

In this section we present our schematics and diagrams for the different blocks of our design.

4.3.1 Turret Mounted Sonar Sensor

Sonar Sensor

The sonar sensor is powered from a 3.3 V supply. When Rx is pulled high, the sensor will continuously sample distances. This is the default setup of the sonar sensor. However, our design requires us to synchronize the distance measurements with the position of the stepper motor. Therefore, Rx is tied to Port 2.6, which pulses a high signal when a reading is necessary. The sensor outputs the data as an analog voltage. This voltage is converted to a digital value by the analog-to-digital converter (ADC) on the microcontroller.

Figure 3: Sonar Sensor Schematic

Stepper Motor

The stepper motor operates on electromagnetic theory. Cycling through the pattern in Table 1 will cause the stepper motor to rotate clockwise (or counterclockwise if cycled in the opposite direction).
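The cycling just described is simple to model in software. The sketch below (Java is used here purely to illustrate the logic; on the ROV this runs as 8051 firmware) encodes the four rows of Table 1 and the index arithmetic that walks them in either direction:

```java
/** Models the four-step full-step sequence from Table 1.
 *  Each row gives the on/off state of the Black, Brown, Orange, and
 *  Yellow coil wires; walking the rows forward turns the motor one way,
 *  walking them backward turns it the other. */
public class StepperSequence {
    // Rows: { Black, Brown, Orange, Yellow }
    static final boolean[][] SEQUENCE = {
        { true,  false, true,  false },  // step 1
        { false, true,  true,  false },  // step 2
        { false, true,  false, true  },  // step 3
        { true,  false, false, true  },  // step 4
    };

    /** Advance the sequence index one step; dir = +1 or -1.
     *  The double modulo wraps correctly in both directions. */
    public static int nextIndex(int index, int dir) {
        return ((index + dir) % 4 + 4) % 4;
    }

    public static void main(String[] args) {
        int idx = 0;
        for (int i = 0; i < 4; i++) {
            boolean[] coils = SEQUENCE[idx];
            System.out.printf("step %d: %b %b %b %b%n",
                    idx + 1, coils[0], coils[1], coils[2], coils[3]);
            idx = nextIndex(idx, +1);
        }
    }
}
```

In the firmware, each row would be written to the four Darlington-driven output pins with a short delay between steps; the table data itself is exactly that of Table 1.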
Darlington transistors are utilized to amplify the current that flows through the stepper motor to enable it to rotate. The output of the Darlington array is an amplified and inverted copy of the input.

Table 1: Stepper Motor Sequence

Step  Black  Brown  Orange  Yellow
1     On     Off    On      Off
2     Off    On     On      Off
3     Off    On     Off     On
4     On     Off    Off     On

Figure 4: Stepper Motor Schematic

4.3.2 Bluetooth

Bluetooth Hardware

For serial communication, the RN-41-SM Bluetooth Module was selected. The module has the following features, making it desirable over the other components we considered:

• High throughput rate of up to 3 Mbit/s
• Class 1 Bluetooth radio (330 ft. theoretical range)
• 2.4 GHz operating frequency
• Low $45 unit cost
• Built-in Bluetooth and UART network stacks
• Bluetooth Serial Port Profile PC interface

The module is connected to the C8051F020 as shown in Figure 5. The pins shown in the header on the left are those of Header B on the RN-41-SM module. In addition to connecting the module to the 8051, one must first remove resistors R6 and R8 on the back of the module. For more information on the location of these resistors, please refer to the RN-41-SM Bluetooth Module datasheet. This concludes the hardware details necessary to set up the Bluetooth serial communications link.

Figure 5: Bluetooth Schematic

Bluetooth Firmware

The Bluetooth firmware refers to the 8051 firmware necessary to drive the module. The module can be accessed like any other UART device. As it is assumed the reader is familiar with this process, and as example source code abounds for this topic, only the basic details will be described here. For a full listing of the source code used to communicate with the Bluetooth module, please refer to the CD submitted to the ECE office.

First, P0.0 and P0.1 must be configured as push-pull ports. Then Timer1 and the UART must be enabled and properly configured to communicate with the module. A 115200 baud rate is used.
Again, greater detail can be found on the CD submitted to the ECE office. After everything is properly set up, the UART 1 interrupt (interrupt 20) will be triggered for every byte received and transmitted. To send a byte, use SBUF1 = ByteValue;. To receive a byte, use ByteValue = SBUF1;.

Bluetooth Software

Once the Bluetooth module is powered on, it can be paired with the computer just like any other Bluetooth device. The default passcode for the module is 1234. Once paired, communicating with the module is very similar to communicating over any other serial port interface. Although most readers may be familiar with this process, it is nonetheless presented here for documentation purposes. This particular method of serial communication is implemented using the Java language. The RXTXcomm library is extremely useful and can be downloaded from http://rxtx.qbang.org. Installation instructions are available at the same website and will not be included here.

4.3.3 Speed Encoder

For the speed encoder we decided to use the typical-application 3-wire circuit found in the datasheet, since it is more than enough to accomplish our speed measuring task.

Figure 6: Speed Encoder Schematic

The output of this circuit is, as described earlier, a Schmitt trigger with an open collector. This output is connected to Port 0 on the microcontroller, where it is used to trigger the microcontroller's external interrupt. Each time a high level is reached, the microcontroller saves the value of Timer 2 and restarts the count. This saved value is then used to determine the speed at which the vehicle is currently traveling.

4.4 Construction Details

Here is an overview of the construction steps and some details. For simplicity, most of the installation was accomplished using hot glue.

4.4.1 Truck Chassis and Body

In order to build our vehicle we had to strip the original truck down. First off, on the chassis, we took out the radio receiver since it was not needed.
Later on, when we realized the original speed controller was not able to control the speed adequately, we removed it from the vehicle and installed the current one.

Due to the type of material the body of the truck is made of, it was necessary to minimize modifications to it in order to maintain its strength. Hence, only two modifications were made to it:

1. Two holes were made in the front to insert the infrared lights.

2. A small slit was cut in the bottom of the back window, where the bed of the truck starts. This slit was made for the cable of the turret sonar sensor.

The turret hole already existed; it was where the original antenna was mounted.

4.4.2 8051F020 & Protoboard

The 8051F020 board was installed between the front poles that hold the body of the vehicle. To accomplish this, a layer of plastic the same size as the board was screwed alongside the board for protection. The small protoboard that contains our circuitry has an adhesive on the bottom. It is installed on top of the 8051F020 board.

4.4.3 Stepper Motor & Sonar Sensor

As mentioned in 4.4.1, the hole we are using for the turret is the one where the original antenna was installed. The turret was fixed from inside the body so that only the rotor of the stepper motor sticks out of the roof. The motor was affixed to the body using hot glue. Ideally, screws would have been the perfect match, but as mentioned earlier that would have weakened the structure of the body further. The sonar sensor is installed on the rotor of the stepper motor. It was affixed using hot glue.

4.4.4 Speed Encoder

To install the speed encoder, the back left wheel had to be removed from the vehicle. Four magnets were installed inside the rim of this wheel using hot glue. The process, however, was slow because the magnets had to be tested against the sensor as they were installed, to ensure proper functioning. The sensor was installed on the axle that holds the wheel. This puts the reading point at the top of the wheel.
Hot glue was used to affix the sensor.

4.4.5 Wireless Camera

The wireless camera is installed at the front of the vehicle, below the body. The base of the camera was affixed to the chassis using hot glue. The 9 V battery that the camera uses was set in an enclosure of the chassis that holds it still. Needless to say, the battery was not glued to the vehicle, as it needs to be replaced.

4.4.6 Summary of Final Design Results

Motor Speed and Direction Control

The motor speed and direction control worked flawlessly. The biggest issue was figuring out how to make them work properly, as documentation on both the speed controller and the servo motor controller was rather poor. Once this was done, the vehicle drove perfectly.

Turret Sensor

The stepper motor worked as expected. It stepped through all 24 angles, one at a time, and then went back to the starting angle and started over. The calibration algorithm implemented by Joseph did a great job of finding the starting angle from any position the motor found itself in when the vehicle was started. The only issue we found was that it heated up fast, so we had to let it cool down between tests in order to prevent it from being damaged by heat.

On the other hand, the sonar sensor did not live up to our expectations. The readings had a jitter of about 5 feet, rendering it nearly impossible to discern an object. There are a couple of factors that could be deterring our readings:

• Part of the sensor encapsulation got melted during installation.
• The constant vibration of the vehicle might be throwing the readings off.

In the end, we came to the conclusion that either a more expensive sonar or a different type of sensor is needed to be able to perform the task.

Speed Encoder

The speed encoder performed as expected. There was an issue with the crossbar that almost prevented the encoder from being realized.
However, in the last moments we came to the conclusion that our latency increased greatly with the addition of the encoder, and hence the driving commands had a noticeable delay. Due to this, we determined not to send the encoder data and to estimate the speed from the joystick instead.

Bluetooth Module

This was also a successful piece. The only issue we had was, as just mentioned, the noticeable control delays that occurred with the encoder. Yet this problem is not inherent to the module but to the code. A minor issue was range, though not that of the module but of the computer paired with it. This happened because the computer's built-in module was a lower class than Class 1 and hence had less range. For demonstration purposes this was not an issue. If greater range were needed, the solution would be to use a USB Bluetooth module on the computer of the same class as the module on the car, that is, a Class 1 USB module.

Video Results

Video results were as expected. We were able to see obstacles both in daylight and in a dark room. The only major issue was the fact that the camera depleted its battery very fast. A minor issue was the viewing angle of the camera: a bit wider would have made driving easier, as the dimensions of the vehicle extend further than the viewing angle does.

Radar

Our radar was successful. We were able to map the points sent by the sonar successfully. The fact that the points are not accurate is not caused by the radar. One minor issue we had was that if the video and the radar were in the same window, the video would flicker. This issue was solved by putting the video and the radar in different windows.

Joystick

The joystick was also successful.

5 Implementation and Assessment

This section covers how we decided to implement each part of our system to meet the demands of the system. We will also cover how successful we were at accomplishing our intended design.
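Before going into the details, one recurring piece of the GUI work is worth sketching: plotting a sonar sample on the radar display is a polar-to-Cartesian conversion, with the y term negated because Java's screen y-axis grows downward. The center point and pixel scale below are illustrative assumptions, not values from the report:

```java
/** Sketch of plotting one sonar sample on a radar-style display.
 *  Java 2D's origin is the top-left corner with y increasing downward,
 *  so the y component is subtracted rather than added. The center
 *  coordinates and pixels-per-distance-unit scale are assumed values. */
public class RadarMapping {
    static final int CENTER_X = 320;            // assumed radar origin on screen
    static final int CENTER_Y = 400;
    static final double PIXELS_PER_UNIT = 10.0; // assumed display scale

    /** angleDeg: 0..180 sweep in front of the truck; distance in sensor units. */
    public static int[] toScreen(double angleDeg, double distance) {
        double r = distance * PIXELS_PER_UNIT;
        double theta = Math.toRadians(angleDeg);
        int x = (int) Math.round(CENTER_X + r * Math.cos(theta));
        int y = (int) Math.round(CENTER_Y - r * Math.sin(theta)); // inverted axis
        return new int[] { x, y };
    }

    public static void main(String[] args) {
        int[] p = toScreen(90, 10); // straight ahead, 10 units out
        System.out.println(p[0] + "," + p[1]); // prints 320,300
    }
}
```

Each converted point would then be passed to the Swing paint code as a small filled dot.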
5.1 Details of Implementation

The following sections cover the details of how we implemented each component.

5.1.1 Graphical User Interface

The PC software was written in Java, so the GUI uses Java's Swing API. To implement the GUI, we had to play around with Java's drawing primitives to determine how they are rendered. We then had to work out the correct trigonometric equations to draw the radar points properly, since the screen's y-axis is inverted in Java. Once this was done, the rest of the radar was easy. Another frame (window) was created for the video. Using the API provided by the camera vendor, we were able to decode the video easily; all that remained was to display it on screen. We displayed the video in a separate window to prevent it from flickering.

Figure 7: Laptop-Joystick

5.1.2 Joystick Interface

Another crucial component of the design is the joystick interface. The user drives the ROV with the joystick, which interfaces with the PC via USB. Perhaps the most challenging part of interfacing the joystick and the PC is finding code libraries that make reading the joystick position easy. The JInput library from newdawnsoftware.com serves exactly this purpose. Once the library is installed on the operating system, it can be used to connect to and read the joystick from Java SE.

5.1.3 Video Interface

The video-to-PC interface is perhaps one of the trickier parts of the design. The first problem is that the output of the wireless video receiver is composite video, which is very friendly to television sets but not at all friendly to computers. To resolve this, we used a composite-video-to-USB converter.

Figure 8: Video Camera and Receiver

Unfortunately, the video converter presented another problem in and of itself: poor documentation.
It was made to work with vendor software and vendor software only; no effort was made to help third parties interface with the device. Despite this, it is reasonable to assume that the engineers who made the device resorted to some type of video standard, and this assumption proved correct. The device outputs a standard NTSC video stream that is easily (the term easily is used very loosely here) captured in Linux by reading the /dev/video0 device node with Video4Linux libraries.

5.1.4 Bluetooth Interface

Interfacing the PC to the MCU over a wireless connection is fairly straightforward given a little information. Because engineering an entire wireless system is beyond the scope of the project, we looked instead for an existing wireless system that could serve as a drop-in replacement for a serial cable connection. Our first thought was to use some type of Wi-Fi module, such as the WiFly GSX. After further research, however, we found a Bluetooth module that would serve the same purpose for nearly half the price. This came at a small reduction in bandwidth, but the price reduction was more than enough justification for the compromise.

Figure 9: Bluetooth Module (mouser.com)

In the end, we selected the RN-41-SM Bluetooth module from Roving Networks. The only modification necessary to interface the module with the C8051F020's UART was removing a couple of shorting resistors (described in greater detail in the previous section). The result was an easy-to-use wireless replacement for the serial connection.

One challenge in working with the Bluetooth module is finding a good serial library to communicate with the device from Java code. The RXTXcomm library from rxtx.qbang.org serves this purpose quite nicely.
Once it is properly installed, it can be used to communicate with the Bluetooth module (for a full listing of the code, see the CD submitted to the ECE office).

5.1.5 Truck Interfacing with MCU

Interfacing the truck with the microcontroller turned out to be harder than expected, not because of anything technically challenging, but because of misleading information from the product manufacturers. The electronic motor controller we are using is designed as a direct replacement for a Futaba servo, which has a neutral PWM pulse width of 1.5 milliseconds (ms). Unfortunately, the neutral pulse for the motor controller is not the 1.5 ms that the manufacturers of both the Futaba servo and the motor controller stated, but exactly 1.42 ms. Since the motor controller has a built-in safety feature that prevents it from turning on until it receives the correct neutral pulse, finding the exact pulse width was critical to being able to test our system; it also proved quite time-consuming.

Another problem with interfacing the microcontroller with the truck was producing the correct voltage levels for everything. Running a line directly off the main 7.2 V battery to power the microcontroller was fairly simple. We also had to raise the voltage of the signals from the microcontroller to the steering servo and motor controller, which required a Darlington pair. Since the motor controller was designed to power the servo already, it has a regulated 6 V supply line that was used to power both the Darlington pair and the steering servo.

Turret/Peripheral Interfacing with MCU

The turret was made by attaching a sonar sensor to a stepper motor. At each step the stepper motor takes, the sonar takes a distance reading, which is then sent to the Bluetooth module and on to the computer. To prevent wires from snapping, the turret was restricted to turning only 180°.
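As a concrete aside on the motor-controller pulses described in Section 5.1.5, the arithmetic can be sketched as follows. The ±0.5 ms full-throttle span and the 1 MHz timer tick are our illustrative assumptions, not values taken from the actual C8051 configuration; only the 1.42 ms neutral point comes from the report:

```java
// Hypothetical mapping from a normalized throttle command to a PWM pulse
// width around the motor controller's 1.42 ms neutral point. The +/-0.5 ms
// span and the 1 MHz timer tick (1 count = 1 microsecond) are assumptions
// for illustration, not our actual C8051 timer setup.
class DrivePwm {
    static final double NEUTRAL_MS = 1.42; // neutral pulse found by experiment
    static final double SPAN_MS = 0.5;     // assumed full-throttle deviation

    // throttle in [-1, 1] -> pulse width in milliseconds (input is clamped).
    static double pulseMs(double throttle) {
        double t = Math.max(-1.0, Math.min(1.0, throttle));
        return NEUTRAL_MS + SPAN_MS * t;
    }

    // Pulse width in ms -> timer counts at an assumed 1 MHz tick.
    static int timerCounts(double pulseMs) {
        return (int) Math.round(pulseMs * 1000.0);
    }
}
```

Under these assumptions, a neutral command produces 1420 counts; because the controller refuses to arm until it sees that exact neutral pulse, getting this one constant right was the key to the whole drive interface.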
Once the turret reached this limit, it would spin back around to its starting position. This required us to manually calibrate the turret each time it started up, to ensure that we were scanning the 180° in front of the truck and that position zero was in line with the front doors, pointed at the driver's door.

6 Final Scope of Work Statement

6.1 Summary of Completed Work

After a lot of work and debugging, we were able to navigate the vehicle in both dark and light environments. We discovered that the range and quality of the video feed depend on the voltage supplied to the wireless camera. We used a 9 V battery to power the video camera, but a different battery with greater current capacity may be desirable for better video quality and range. Nonetheless, the video quality was adequate to successfully navigate the ROV.

The sonar turret was also successfully implemented. The sensor, mounted on a stepper motor with hot glue, swept through its series of steps, returned to its start, and repeated. Unfortunately, due to interference from the vibrations of the stepper motor and the rotation of the sonar emitter, the sensor was unable to provide accurate readings while rotating. When stationary, the sensor provided very accurate distance readings.

6.2 Summary of Work Remaining

A few portions of the proposed design remain to be implemented. We were able to integrate speed encoder readings into the design; however, we decided at the last minute to leave the data out of the Bluetooth stream to reduce latency in the ROV control loop. More time to debug and optimize the communication packets would allow complete integration of the speed encoder.

6.3 Lessons Learned

The most valuable lesson we learned from this project was how to interface with proprietary components. We ran into a lot of problems at first trying to interface the various components with the 8051.
This required us to step back, reconsider how we could reverse engineer the components, and attack the problem in a different way. Our eyes were opened to different avenues for solving problems and to how companies engineer and manufacture their products.

7 Other Issues

Component or material suppliers

To mass-produce this device, it would be necessary to find reliable manufacturers that can guarantee good parts and good documentation throughout the lifetime of the product.

Safety

To keep the project safe, we decided to limit the speed of the vehicle, which also makes it easier to maneuver. In the event of a crash, this decision prevents the vehicle from being wrecked, since significant, dangerous speed is never attained. Additionally, the camera and the radar mounted on the vehicle are meant to increase the driver's awareness of the environment, further increasing safety.

Societal and Economic Impact

This project will have a positive impact on society because it will significantly reduce deer collisions. Similarly, it will benefit wildlife, as fewer deer will die in crashes. On the economic side, it will also have a positive impact, as less money is spent on car repairs after collisions and cities spend less money removing dead deer from roads.

8 Cost Estimation

This section covers cost estimates for the design of the system and for commercially mass-producing it.

8.1 Estimate of System Cost

Table 2 lists the items required for building this system and approximately how much each unit would cost if we were able to mass-produce them.

Table 2: System Cost

    Component            Price
    IR Camera            $15
    RC Car               $80
    Stepper Motor        $5
    Motor Controller     $40
    Sonar Sensor         $25
    IR Sensors (x2)      $12
    Hall Effect Sensor   $1
    Square Magnets (x4)  $1
    Bluetooth            $38
    Misc Material        $5
    Labor (< 1 hour)     $6
    Total                $228

8.2 Estimate of Design Cost

Table 3 lists each of the elements from the table above that were required to design the system.
As seen, the cost is much higher, because each item costs more when not bought in bulk. Also, the bulk of the cost listed here is the time required for us to design the system. This design cost may or may not recur in mass production, depending on the product's support lifetime.

Table 3: Design Cost

    Component            Price
    IR Camera            $38
    RC Car               $80
    Stepper Motor        $10
    Motor Controller     $40
    Sonar Sensor         $32
    IR Sensors (x2)      $30
    Hall Effect Sensor   $3
    Square Magnets (x4)  $6
    Bluetooth            $45
    Misc Material        $20
    Labor (800 hours)    $16,000
    Total                $16,304

9 Project Management Summary

This section goes over the management resources required to successfully complete this project in a timely manner.

9.1 Tasks

The majority of the tasks required to complete this project have been met. The following tasks have been started or have yet to be started, but are not complete:

• Mounting turret to truck
• Testing communicating turret data to PC
• Proximity Sensors

9.2 Timeline

The following chart shows the general tasks involved in completing this project. The chart is current as of April 26, 2011.

Figure 10: Gantt Chart

9.3 Personnel

To complete this design, we needed a blend of computer and electrical engineers; the combination covered requirements from programming to hardware. Crismar added the computer engineering expertise the project needed and was primarily responsible for overseeing the computer programming and interfacing. Scott assisted Crismar and brought his expertise in interfacing our components to the PC. Both Crismar and Scott worked on processing data, displaying that data to the user, and interfacing the joystick and the Bluetooth module to the computer. Ken brought his knowledge of mobile robots and radio-controlled vehicles, which was vital for communicating with and controlling the truck. Joseph was primarily responsible for the turret design and construction.
10 Conclusion

In conclusion, objects that a driver cannot see directly but that could get in the way represent a hazardous situation that needs to be addressed. To address this problem, we created a sonar turret that scans the 180° area ahead of the front of the vehicle. We also designed a user-friendly GUI to relay the information to the vehicle operator. In a 13-week period, on a budget of $304, we were able to design and test the proposed product. The tests undertaken have shown that our project could serve as the basis for a mass-produced device that makes driving safer while enhancing the driver's experience. Further testing should smooth out the finished product into something that the average car owner would find helpful in low-light driving.