Wednesday, 15 July 2015

Am I really a blogger?

Surprised to see that my last update was in February!

I have now got an Agobo robot which follows a black line, most of the time, and also attempts to get out of a maze.
Both of these programs are based on those supplied with the robot.

I have been trying to understand threading in Python, so I have created a program which uses the switch to change between line following, maze solving, quit and shutdown. This seems to work. I have also got an IP display from 4tronix which I use to display the mode. It all seems to work, and I must put the code on GitHub on the off chance someone may be interested.
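For the record, a minimal sketch of the sort of mode-switching thread I mean. The class and behaviour names here are placeholders, not my actual robot code:

```python
import threading
import time

class ModeRunner:
    """Runs one behaviour at a time on a background thread."""

    def __init__(self):
        self.mode = "idle"
        self._stop = threading.Event()
        self._thread = None

    def _worker(self, behaviour):
        # Repeat the behaviour until told to stop.
        while not self._stop.is_set():
            behaviour()
            time.sleep(0.05)

    def set_mode(self, name, behaviour):
        # Stop the old behaviour thread before starting the new one.
        if self._thread is not None:
            self._stop.set()
            self._thread.join()
        self._stop.clear()
        self.mode = name
        self._thread = threading.Thread(target=self._worker, args=(behaviour,))
        self._thread.daemon = True
        self._thread.start()

    def shutdown(self):
        if self._thread is not None:
            self._stop.set()
            self._thread.join()
        self.mode = "idle"

counter = {"ticks": 0}

def follow_line():
    counter["ticks"] += 1  # stand-in for the motor/sensor code

runner = ModeRunner()
runner.set_mode("line", follow_line)
time.sleep(0.3)
runner.shutdown()
print(runner.mode, counter["ticks"] > 0)  # → idle True
```

On the robot the behaviour functions would drive the motors; here a counter stands in so the sketch runs anywhere.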

The robot sometimes does not detect the edge of the line. This could be because of the speed it is moving and the fact I have a 0.2 second sleep statement in the main loop. I must investigate whether decreasing the delay improves the performance, or maybe I could add a callback on the edge detection.

Then another idea I have is to use my series 2 Model B to detect when the Agobo attempts to move out of an area using OpenCV. This would mean having to look at communication between the two Pis.
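A plain TCP socket might do for the Pi-to-Pi link. A minimal sketch, not the real robot code: it runs on localhost so it can be tried on one machine, and the OUT_OF_AREA message is just a placeholder for whatever the OpenCV side would send.

```python
import socket
import threading

def serve_once(result, ready, portbox):
    # Accept one connection, record the message, and reply with an ack.
    # On the robot this would run on the Agobo's Pi.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", 0))            # port 0: let the OS pick a free port
        portbox.append(srv.getsockname()[1])
        srv.listen(1)
        ready.set()                           # tell the sender we are listening
        conn, _ = srv.accept()
        with conn:
            result.append(conn.recv(1024).decode())
            conn.sendall(b"ACK")

def send(port, message):
    # On the robot this would run on the camera Pi, connecting to the
    # Agobo Pi's address instead of localhost.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect(("127.0.0.1", port))
        cli.sendall(message.encode())
        return cli.recv(1024).decode()

received, portbox = [], []
ready = threading.Event()
t = threading.Thread(target=serve_once, args=(received, ready, portbox))
t.start()
ready.wait()
reply = send(portbox[0], "OUT_OF_AREA")
t.join()
print(received[0], reply)  # → OUT_OF_AREA ACK
```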

Friday, 20 February 2015

Get_instance is not working

So it looks like my problems are nothing to do with OpenCV. After adding some debug statements I can see the code is making two instances of my camera class, so the second one can't get access to the camera, hence the error.

Not sure why it is not working; I will have to do a bit more digging.
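For reference, a sketch of how the get_instance (singleton) pattern is usually written in Python. The Camera class here is a stand-in, not my actual code; if two instances are still appearing, something is probably calling the constructor directly instead of going through get_instance.

```python
import threading

class Camera:
    _instance = None
    _lock = threading.Lock()

    @classmethod
    def get_instance(cls):
        # Create the camera once; every later call returns the same object.
        with cls._lock:
            if cls._instance is None:
                cls._instance = cls()
        return cls._instance

    def __init__(self):
        self.opened = True  # stand-in for actually opening the camera device

a = Camera.get_instance()
b = Camera.get_instance()
print(a is b)  # → True
```

One other thing worth checking: Flask's debug reloader re-runs the application, which can leave module-level state created more than once even when get_instance itself is correct.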

Thursday, 19 February 2015

Raspberry Pi robots

So I have been beavering away for ages now trying to get a robot to follow a line. I have got a Model B connected to a BrickPi controlling a Lego robot.
It has been relatively straightforward to control the robot using a Nintendo Wii controller.

A while ago I came across OpenCV and SimpleCV and thought it would be interesting to use the camera to detect masking tape and follow it. Then before Christmas The MagPi included an article on Pietar, which recognised traffic symbols, so I have started to implement this code. Although I could get it to work statically, I could not get it to work dynamically because the video picture was not being streamed.

This has led me to look at loads of other approaches: the Dexter streaming robot and CoderBot, to name but a few. I have also looked at Flask; in particular, Ben Nuttall's Bett-Bot proved useful for getting some understanding. Then I came across a tutorial from Miguel Grinberg, and I have started to use his approach.

In the meantime I have modified my original code to make more use of classes, and to follow the CoderBot approach of using get_instance to create only one copy of the class.

My original program based on Pietar now works; I can control the robot from the Wii controller and it processes video. But when I run everything from my Flask app I get an error:
libv4l2: error setting pixformat: Device or resource busy
HIGHGUI ERROR: libv4l unable to ioctl S_FMT
HIGHGUI ERROR: V4L: Initial Capture Error: Unable to load initial memory buffers.
*** glibc detected *** corrupted double-linked list

The reference to memory buffers made me wonder about the memory split, but increasing it to 64 MB has not made a difference.

Time to investigate further; 128 MB did not change anything either.
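For reference, the split is set in /boot/config.txt (or via raspi-config), and the value is in megabytes:

```
# /boot/config.txt -- GPU memory split, in MB (takes effect after a reboot)
gpu_mem=128
```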

Sunday, 8 February 2015

Almost three years later

Well so much for writing something every day!

Quite a lot has happened since my last post of June 2012. I am still not working, although I am still a STEM ambassador, and I am now a Code Club volunteer running two clubs a week at a local primary and secondary school.

I also trained as a Code Club Pro volunteer to help train teachers about computing, giving them some background for the new computing curriculum. That unfortunately has not taken off in my area; I have contacted six schools several times and have not acquired any bookings.

I have become addicted to Raspberry Pis, with at the moment three of my own: two Model Bs and one B+, plus a dead Model A. Do not drop things on a Raspberry Pi; they tend to get squashed!

The main problem I am finding with the Raspberry Pis is the number of projects and uses people are coming up with. This means I never get to complete one!

One Pi is sitting waiting to be fitted into a bird box so I can install it in the garden and take time-lapse photographs. It is fitted with the NoIR camera and a Bright Pi.

The second is mounted on a Dexter Industries BrickPi board and Lego robot. I have got the robot working via a Nintendo controller or from the web. I have been looking at OpenCV and SimpleCV for camera recognition. There are lots of posts on this (CoderBot, Pietar and Dexter Industries) and I keep flitting between each of them. Currently struggling to understand how to send an image back to a web site.

Well, hopefully it won't be three years to the next post.