
Tuesday, March 13, 2012

My Remote's a Pain in the Hand


I am starting to hate my Samsung remote a lot more than I used to, and being a TV addict, I guess the design of a TV remote is worth sweating over. I have seen many Japanese/Chinese remotes that use the NEC protocol and have a far better design when it comes to small TVs. Ordinary remotes offer limited functionality and hence don't require a lot of buttons. In the case of a Samsung LCD TV, however, there is a whole lot of options built into the TV, like HDMI, Media Play, media controls, etc. So many buttons, properly spaced, have been placed on the remote control, but the ergonomics of the device haven't been studied extensively. I believe it's time to move ahead and try to evolve a new interface for interacting with the TV.

I therefore plan to make a gesture-based remote control that can be used to control the menu options with a simple flick of your hand. The remote control isn't the costliest part of your home entertainment system, and I plan on keeping it that way. The device will be kept affordable by using basic hardware and adding complexity (simplicity on the user side) through the software part of the implementation. Once done, the user simply has to make palm gestures over the device to control the TV. I am excited by even the very basic implementation (or I am just sick of my remote).

Now, the following are the steps to be taken, in that order, to get to the final product.

STEP 1: The first step involves taking four IR emitter/detector pairs and connecting them to the ADC channels of a microcontroller. The ADC output values are to be plotted against time to observe how they vary as the hand passes over the sensors in different directions and at different orientations. The readings are to be sent to a PC via the serial port and the output data plotted appropriately (use MATLAB or Python). A minimal firmware sketch for this step follows.
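As a rough illustration of STEP 1, here is a minimal AVR-style sketch that samples the four sensor channels and streams the readings over the UART so they can be plotted on the PC. The part (ATmega32-class), clock, baud rate and channel mapping are assumptions, not the actual hardware choices.

#define F_CPU 8000000UL                 /* assumed clock; change to match the board */
#include <avr/io.h>
#include <util/delay.h>

static void uart_init(void)
{
    UBRRH = 0;
    UBRRL = 51;                         /* 9600 baud at 8 MHz */
    UCSRB = (1 << TXEN);                /* transmitter only */
    UCSRC = (1 << URSEL) | (1 << UCSZ1) | (1 << UCSZ0);  /* 8N1 frame format */
}

static void uart_tx(uint8_t b)
{
    while (!(UCSRA & (1 << UDRE)))
        ;
    UDR = b;
}

static uint8_t adc_read(uint8_t channel)
{
    ADMUX = (1 << REFS0) | (1 << ADLAR) | (channel & 0x07);  /* AVcc ref, left-adjusted */
    ADCSRA |= (1 << ADSC);              /* start a conversion */
    while (ADCSRA & (1 << ADSC))
        ;
    return ADCH;                        /* top 8 bits are enough for plotting */
}

int main(void)
{
    uart_init();
    ADCSRA = (1 << ADEN) | (1 << ADPS2) | (1 << ADPS1);  /* ADC on, clk/64 */

    for (;;) {
        uart_tx(0xFF);                  /* frame marker so the PC side can sync */
        for (uint8_t ch = 0; ch < 4; ch++) {
            uint8_t v = adc_read(ch);
            uart_tx(v == 0xFF ? 0xFE : v);  /* keep 0xFF reserved for the marker */
        }
        _delay_ms(10);                  /* roughly 100 frames per second */
    }
}

On the PC side the four columns can be logged from the serial port and plotted against time, which is exactly the data needed to pick the gestures in STEP 2.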


STEP 2: Once this is done, we need to decide on the kind of gestures to be mapped to the different remote functions (usability matters; be very observant here, and don't injure the user with acrobatic hand movements). The gestures are to be stored in the form of one-dimensional arrays, which then become the templates against which the motion of the user's hand is compared to decide which IR command to send. A rough sketch of that template matching follows.
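As a rough illustration of the template idea, the sketch below scores a captured sample window against stored one-dimensional gesture templates using a sum of absolute differences. The gesture set, template contents, window length and rejection threshold are placeholders to be filled in from the STEP 1 recordings.

#include <stdint.h>
#include <stdlib.h>

#define GESTURE_LEN   32            /* samples per gesture window (assumed) */
#define NUM_GESTURES  4

enum gesture { SWIPE_LEFT, SWIPE_RIGHT, SWIPE_UP, SWIPE_DOWN };

/* One-dimensional templates recorded during STEP 1 (dummy values here). */
static const uint8_t templates[NUM_GESTURES][GESTURE_LEN] = { { 0 } };

/* Returns the index of the best-matching gesture, or -1 if nothing is close enough. */
int8_t match_gesture(const uint8_t sample[GESTURE_LEN])
{
    uint16_t best_score = 0xFFFF;
    int8_t best_index = -1;

    for (uint8_t g = 0; g < NUM_GESTURES; g++) {
        uint16_t score = 0;
        for (uint8_t i = 0; i < GESTURE_LEN; i++)
            score += (uint16_t)abs((int16_t)sample[i] - (int16_t)templates[g][i]);

        if (score < best_score) {
            best_score = score;
            best_index = (int8_t)g;
        }
    }

    /* Reject weak matches to avoid false alarms (threshold is a guess). */
    return (best_score < 400) ? best_index : -1;
}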

STEP 3: Write the code to implement the various IR communication protocols for the devices to be controlled. This also involves building the actual circuit around the microcontroller, setting up the clocks, etc. We may also build a GUI to set up the various options of the microcontroller via a serial port, or create a way to set these things up over USB and make the device USB compliant (worth a look, as it's been a while since I started something powerful). A rough sketch of the IR transmit side follows STEP 4.

STEP 4: Use machine learning concepts to optimize the user experience; the device should be taught to adapt to the user's habits. Avoiding false alarms is the key to this design.
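As a starting point for the IR side, here is a hedged sketch of a bit-banged NEC-format transmitter on an AVR pin. The pin assignment, clock and the use of software delays (rather than a timer-generated 38 kHz carrier) are assumptions, and Samsung's NEC-like variant uses different leader timings, so the constants would need adjusting for the actual TV.

#define F_CPU 8000000UL
#include <avr/io.h>
#include <util/delay.h>

#define IR_PORT PORTB
#define IR_DDR  DDRB
#define IR_PIN  PB0                     /* IR LED driver pin (assumed) */

/* Emit the ~38 kHz carrier for roughly 'us' microseconds. */
static void ir_mark(uint16_t us)
{
    for (uint16_t i = 0; i < us / 26; i++) {
        IR_PORT |= (1 << IR_PIN);
        _delay_us(13);
        IR_PORT &= ~(1 << IR_PIN);
        _delay_us(13);
    }
}

static void ir_space(uint16_t us)
{
    IR_PORT &= ~(1 << IR_PIN);
    while (us >= 100) {
        _delay_us(100);
        us -= 100;
    }
}

static void ir_send_byte(uint8_t b)
{
    for (uint8_t i = 0; i < 8; i++) {   /* LSB first */
        ir_mark(562);
        ir_space((b & 1) ? 1687 : 562); /* long space = '1', short space = '0' */
        b >>= 1;
    }
}

void nec_send(uint8_t address, uint8_t command)
{
    IR_DDR |= (1 << IR_PIN);
    ir_mark(9000);                      /* leader burst */
    ir_space(4500);
    ir_send_byte(address);
    ir_send_byte((uint8_t)~address);    /* NEC sends the inverted address */
    ir_send_byte(command);
    ir_send_byte((uint8_t)~command);    /* ...and the inverted command */
    ir_mark(562);                       /* trailing mark */
    ir_space(40000);                    /* gap before any repeat frame */
}

A small table of (address, command) pairs indexed by the gesture id from STEP 2 would then drive nec_send().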

(Image caption: not exactly like this, but on similar lines.)

Saturday, July 23, 2011


Accelerometer-based motion control



'Accelerometer-based motion control' is a project that I had been looking forward to for many days. Finally, the robot was built to a good level of satisfaction, and I am here to share the details of that project with you. First, selecting the right accelerometer was crucial, as the entire project's success banked on it. The options available were the MMA7260 and the MMA7660; the former is analog and the latter digital. The MMA7660 uses an I2C interface to connect to a microcontroller or other peripheral devices, but that was too complex, as an I2C routine would have had to be written especially for the microcontroller. Hence the easier choice was the MMA7260, an accelerometer with an analog interface that gives the orientation outputs as voltages. These outputs were connected to the ADC channels of the microcontroller. An RF module pair was used to send the data wirelessly to the robot. In the end, the robot moved based on the gestures made by the arm to which the accelerometer was attached. A rough sketch of the tilt-to-command mapping follows.
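For a feel of how the transmitter side might have looked, here is a hedged sketch that reads the MMA7260's X and Y outputs on two ADC channels and maps the tilt to single-byte drive commands pushed out over the UART to the RF module. The channel numbers, thresholds, baud rate and command bytes are illustrative assumptions, not the values from the actual build.

#define F_CPU 8000000UL
#include <avr/io.h>
#include <util/delay.h>

#define CMD_STOP     'S'
#define CMD_FORWARD  'F'
#define CMD_BACK     'B'
#define CMD_LEFT     'L'
#define CMD_RIGHT    'R'

static uint16_t adc_read(uint8_t channel)
{
    ADMUX = (1 << REFS0) | (channel & 0x07);   /* AVcc reference */
    ADCSRA |= (1 << ADSC);
    while (ADCSRA & (1 << ADSC))
        ;
    return ADC;                                /* full 10-bit result */
}

static void rf_send(uint8_t cmd)
{
    while (!(UCSRA & (1 << UDRE)))
        ;
    UDR = cmd;                                 /* RF transmitter sits on the UART TX pin */
}

int main(void)
{
    UBRRH = 0;
    UBRRL = 207;                               /* 2400 baud at 8 MHz, typical for ASK RF links */
    UCSRB = (1 << TXEN);
    ADCSRA = (1 << ADEN) | (1 << ADPS2) | (1 << ADPS1);

    for (;;) {
        uint16_t x = adc_read(0);              /* X axis on ADC0 (assumed wiring) */
        uint16_t y = adc_read(1);              /* Y axis on ADC1 (assumed wiring) */
        uint8_t cmd = CMD_STOP;

        /* Around 512 counts the hand is level; a dead band avoids jitter. */
        if      (y > 612) cmd = CMD_FORWARD;
        else if (y < 412) cmd = CMD_BACK;
        else if (x > 612) cmd = CMD_RIGHT;
        else if (x < 412) cmd = CMD_LEFT;

        rf_send(cmd);
        _delay_ms(50);
    }
}

The receiver on the robot would simply read the same bytes from its RF module and switch the motor driver inputs accordingly.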