Cloud robotics demo code (now with actual code and links to BBC and FT)

The more technical aspects of Spirit, and carry-over discussion from Kickstarter updates.
headamage
Posts: 29
Joined: Mon Dec 04, 2017 5:27 pm

Cloud robotics demo code (now with actual code and links to BBC and FT)

Postby headamage » Sat Mar 17, 2018 2:38 am

Starting a thread now for some code I am planning to upload next week, for a demo organised by King's College London and Ericsson to demonstrate cloud robotics over a prototype 5G network deployment in London.
With that said, what is the preferred method for sharing code with the community?

The code performs the following functions (a rough sketch of the Pi-to-server hop is included at the end of this post):
Passes all sensor data from the Arduino to the Pi
The Pi transmits all sensor data over WiFi to a server (in my case an Ubuntu VM)
The server runs logic similar to demo mode 2 and pushes instructions for actions to the Pi (edge detection, range detection, servo moves, pixels)
The Pi pushes motor/servo/pixel instructions to the Arduino/PIC
The server draws real-time graphs of sensor data and motor speeds

Additionally, the Pi can stream a live camera feed using WebRTC, but this comes from a packaged solution and not code that I developed.
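To make the data path concrete, here is a minimal sketch of the Pi-to-server hop in Python. It is not the demo code itself: read_sensors() is a placeholder for however the values come off the Arduino/PIC, and the server address, port and field names are made up for illustration.

Code: Select all

#!/usr/bin/env python3
# Minimal sketch: forward rover sensor readings to the server over UDP as JSON.
# read_sensors() is a placeholder for the real SPI/serial read from the Arduino/PIC.
import json
import socket
import time

SERVER = ("192.168.1.50", 5005)   # placeholder: IP/port of the Ubuntu VM

def read_sensors():
    """Placeholder returning dummy values; swap in the real sensor read."""
    return {"range_cm": 0.0, "surface": [0, 0, 0, 0], "motor_left": 0, "motor_right": 0}

def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        reading = read_sensors()
        reading["t"] = time.time()                  # timestamp for the graphs
        sock.sendto(json.dumps(reading).encode(), SERVER)
        time.sleep(0.05)                            # roughly 20 updates per second

if __name__ == "__main__":
    main()
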
Last edited by headamage on Sun Apr 08, 2018 3:22 pm, edited 2 times in total.

sm7tix
Posts: 31
Joined: Sat Oct 28, 2017 4:21 pm

Re: Cloud robotics demo code

Postby sm7tix » Sat Mar 17, 2018 3:28 am

I am also wondering where we should share the code and other scripts we make.
Just now I tried the camera with this bash command:

Code: Select all

raspivid -vf -t 0 -l -o tcp://0.0.0.0:3333 #start stream on the rover


And in VLC on the computer:

Code: Select all

tcp/h264://<rover ip>:3333
Kind regards

Stefan / SM7TIX

headamage
Posts: 29
Joined: Mon Dec 04, 2017 5:27 pm

Re: Cloud robotics demo code

Postby headamage » Sat Mar 17, 2018 2:55 pm

I recommend this for Pi camera streaming:
https://www.linux-projects.org/uv4l/installation/

I then put this in /etc/rc.local

Code: Select all

uv4l --auto-video_nr --driver raspicam --encoding mjpeg --custom-sensor-config 5 --framerate 49 --vflip yes --hflip yes --exposure fixedfps --video-denoise no --server-option '--port=9000' --server-option '--admin-password=admin --enable-webrtc-audio=0 --webrtc-enable-dscp yes --webrtc-receive-audio no --webrtc-receive-video no --enable-webrtc-datachannels no --webrtc-rendered-window 0 0 1280 720'


You can visit your RasPi's IP on port 9000 and you will get a nice web GUI that lets you change other settings and view either the MJPEG stream or two-way video/audio with WebRTC.
It works really well for my needs, since MJPEG gives very low latency and WebRTC gives very high compression.
Last edited by headamage on Sun Mar 18, 2018 3:36 pm, edited 1 time in total.

headamage
Posts: 29
Joined: Mon Dec 04, 2017 5:27 pm

Re: Cloud robotics demo code

Postby headamage » Sat Mar 17, 2018 3:02 pm

Here is a little teaser of the graphs from the surface and range sensors, along with motor speeds. I will extend this to include all the sensors on board.
https://photos.app.goo.gl/NCaP5WofUyOtlJQg2
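For anyone wondering how a live graph like this can be drawn, here is a minimal matplotlib sketch of a scrolling plot. It animates dummy values via get_sample(), which is just a stand-in for readings arriving over the UDP socket; the axis labels and window size are arbitrary.

Code: Select all

#!/usr/bin/env python3
# Minimal sketch: live-updating line plot of one sensor channel with matplotlib.
import random
from collections import deque

import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

WINDOW = 200                              # number of samples kept on screen
values = deque([0.0] * WINDOW, maxlen=WINDOW)

def get_sample():
    """Stand-in for a value received over UDP (e.g. the rangefinder in cm)."""
    return random.uniform(0, 100)

fig, ax = plt.subplots()
line, = ax.plot(range(WINDOW), values)
ax.set_ylim(0, 100)
ax.set_xlabel("sample")
ax.set_ylabel("range (cm)")

def update(_frame):
    values.append(get_sample())           # drop the oldest sample, add the newest
    line.set_ydata(values)
    return (line,)

ani = FuncAnimation(fig, update, interval=50, blit=True)
plt.show()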

sm7tix
Posts: 31
Joined: Sat Oct 28, 2017 4:21 pm

Re: Cloud robotics demo code

Postby sm7tix » Sat Mar 17, 2018 4:53 pm

headamage wrote:I recommend this for Pi camera streaming:
https://www.linux-projects.org/uv4l/installation/

I then put this in /etc/rc.local

Code: Select all

uv4l --auto-video_nr --driver raspicam --encoding mjpeg --custom-sensor-config 5 --framerate 49 --vflip yes --exposure fixedfps --video-denoise no --server-option '--port=9000' --server-option '--admin-password=admin --enable-webrtc-audio=0 --webrtc-enable-dscp yes --webrtc-receive-audio no --webrtc-receive-video no --enable-webrtc-datachannels no --webrtc-rendered-window 0 0 1280 720'


You can visit your RasPi's IP on port 9000 and you will get a nice web GUI that lets you change other settings and view either the MJPEG stream or two-way video/audio with WebRTC.
It works really well for my needs, since MJPEG gives very low latency and WebRTC gives very high compression.


Ok! Thanks.
I will try it now.
Kind regards

Stefan / SM7TIX

headamage
Posts: 29
Joined: Mon Dec 04, 2017 5:27 pm

Re: Cloud robotics demo code

Postby headamage » Sat Mar 17, 2018 9:54 pm

https://photos.app.goo.gl/hx9gXN5FmgjjyN6h2
This is the final design of the sensor GUI. It seems to be working fine. The screenshot is with the rover sitting on my desk.
Unless I am missing something, this should cover every environment sensor on the rover. Does anyone know if there are additional sensors hidden?

I just finished coding remote controls over WiFi so I can drive the rover manually if needed. I am currently using WASD on the keyboard to drive it.
I will extend it to include servo movements if I have time before the big day and of course I will share all my code after the event.
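In case it helps anyone building their own manual control, here is a rough sketch of what the keyboard side can look like: it reads single WASD keypresses from a Linux terminal and sends one small UDP datagram per key. This is not the demo code; the Pi's address and the command strings are placeholders for whatever protocol the receiving script expects.

Code: Select all

#!/usr/bin/env python3
# Minimal sketch: read WASD keypresses from the terminal and send drive commands
# to the Pi over UDP. Linux-only (uses termios/tty for raw keyboard input).
import socket
import sys
import termios
import tty

PI = ("192.168.1.60", 5006)       # placeholder: IP/port the Pi listens on
COMMANDS = {"w": "forward", "s": "backward", "a": "left", "d": "right", " ": "stop"}

def read_key():
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)            # read a single keypress without waiting for Enter
        return sys.stdin.read(1)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)

def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    print("Drive with WASD, space to stop, q to quit")
    while True:
        key = read_key().lower()
        if key == "q":
            break
        if key in COMMANDS:
            sock.sendto(COMMANDS[key].encode(), PI)

if __name__ == "__main__":
    main()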

TomTheWhittler
Posts: 27
Joined: Wed Sep 13, 2017 5:04 am

Re: Cloud robotics demo code

Postby TomTheWhittler » Sun Mar 18, 2018 2:00 am

Thanks for doing this.
Research is the only place in a company where you can continually have failures and still keep your job.
I knew immediately that was where I belonged.

sm7tix
Posts: 31
Joined: Sat Oct 28, 2017 4:21 pm

Re: Cloud robotics demo code

Postby sm7tix » Sun Mar 18, 2018 2:22 pm

headamage wrote:https://photos.app.goo.gl/hx9gXN5FmgjjyN6h2
This is the final design of the sensor GUI. It seems to be working fine. The screenshot is with the rover sitting on my desk.
Unless I am missing something, this should cover every environment sensor on the rover. Does anyone know if there are additional sensors hidden?

I just finished coding remote controls over WiFi so I can drive the rover manually if needed. I am currently using WASD on the keyboard to drive it.
I will extend it to include servo movements if I have time before the big day and of course I will share all my code after the event.

Sounds great!
My rangefinder always gives me a static value. Very much looking forward to your code.
Kind regards

Stefan / SM7TIX

headamage
Posts: 29
Joined: Mon Dec 04, 2017 5:27 pm

Re: Cloud robotics demo code

Postby headamage » Sun Mar 18, 2018 3:41 pm

This is quite strange. I had a similar problem with my rangefinder, but I managed to get values after I initialised it on the Arduino.
Here is what my Arduino sketch looks like (I am using the PiControl base sketch):

Code: Select all

#include "Hardware.h"

void setup(){
  // ************ THE SETUP ITEMS BELOW SHOULD GENERALLY BE USED FOR ALL SPIRIT SKETCHES **************
  hardwareBegin();                    //initialize Spirit's Arduino processor to work with his circuitry
  SPI_Reset();                        //resets and turns on the SPI port
  playStartChirp();                   //Play startup chirp and blink eyes
  servoCenters();                     //Place all servos back in default position
 
  // ************ THE SETUP ITEMS BELOW CAN BE CUSTOMIZED FOR YOUR SPECIFIC SPIRIT SKETCH **************
  PIC_SetShutdownDelay(10000,40);
  PIC_RangefinderEnable();
  PIC_SetRangefinderAutoInterval(10);

}

void loop(){
  SPI_Handler();  // looks for incoming SPI data (likely motor control instructions from Pi)
  // add your own loop code here

}// end of loop() function


Notice the PIC_RangefinderEnable();

That's what did the trick for me. I also highly recommend increasing the shutdown delay to 10 seconds, because I found the default 5 seconds is not enough and the RasPi was losing power in the middle of its shutdown sequence. The rangefinder interval is optional; I set it to 10 and haven't experimented with it since.

All in all, the above simply passes all sensor data to the Pi. I then use Python scripts to retrieve the data and feed it over a UDP socket to my Ubuntu VM, where I can draw graphs and perform basic movement routines, effectively acting the same way as demo mode 2 of the shipping code, except over the network. It's quite neat and a great demonstration of cloud robotics and of how important network quality is for such applications.

I am currently in the process of adding gripper and servo functions to the manual control, so I can drive the rover manually and also control the servos.
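For reference, here is a bare-bones sketch of what the server side of that loop can look like: receive the sensor JSON over UDP, apply a simple range threshold, and push a drive command back to the Pi. It is only an illustration of the pattern, not the demo code; the addresses, field names and the 20 cm threshold are placeholders.

Code: Select all

#!/usr/bin/env python3
# Minimal sketch of the server loop: receive sensor JSON over UDP, decide on an
# action (a crude stand-in for demo mode 2 logic), and push a command to the Pi.
import json
import socket

LISTEN = ("0.0.0.0", 5005)        # placeholder: where sensor datagrams arrive
PI = ("192.168.1.60", 5006)       # placeholder: where the Pi listens for commands
RANGE_STOP_CM = 20.0              # placeholder threshold for obstacle avoidance

def decide(reading):
    """Very rough avoidance rule: back off when something is close, else drive on."""
    if reading.get("range_cm", 999) < RANGE_STOP_CM:
        return "backward"
    return "forward"

def main():
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(LISTEN)
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        data, _addr = rx.recvfrom(4096)
        try:
            reading = json.loads(data.decode())
        except ValueError:
            continue                          # ignore malformed packets
        tx.sendto(decide(reading).encode(), PI)

if __name__ == "__main__":
    main()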

sm7tix
Posts: 31
Joined: Sat Oct 28, 2017 4:21 pm

Re: Cloud robotics demo code

Postby sm7tix » Sun Mar 18, 2018 4:53 pm

headamage wrote:This is quite strange. I had a similar problem with my rangefinder, but I managed to get values after I initialised it on the Arduino.
Here is what my Arduino sketch looks like (I am using the PiControl base sketch):

Code: Select all

#include "Hardware.h"

void setup(){
  // ************ THE SETUP ITEMS BELOW SHOULD GENERALLY BE USED FOR ALL SPIRIT SKETCHES **************
  hardwareBegin();                    //initialize Spirit's Arduino processor to work with his circuitry
  SPI_Reset();                        //resets and turns on the SPI port
  playStartChirp();                   //Play startup chirp and blink eyes
  servoCenters();                     //Place all servos back in default position
 
  // ************ THE SETUP ITEMS BELOW CAN BE CUSTOMIZED FOR YOUR SPECIFIC SPIRIT SKETCH **************
  PIC_SetShutdownDelay(10000,40);
  PIC_RangefinderEnable();
  PIC_SetRangefinderAutoInterval(10);

}

void loop(){
  SPI_Handler();  // looks for incoming SPI data (likely motor control instructions from Pi)
  // add your own loop code here

}// end of loop() function


Notice the PIC_RangefinderEnable();

That's what did the trick for me. I also highly recommend increasing the shutdown delay to 10 seconds, because I found the default 5 seconds is not enough and the RasPi was losing power in the middle of its shutdown sequence. The rangefinder interval is optional; I set it to 10 and haven't experimented with it since.

All in all, the above simply passes all sensor data to the Pi. I then use Python scripts to retrieve the data and feed it over a UDP socket to my Ubuntu VM, where I can draw graphs and perform basic movement routines, effectively acting the same way as demo mode 2 of the shipping code, except over the network. It's quite neat and a great demonstration of cloud robotics and of how important network quality is for such applications.

I am currently in the process of adding gripper and servo functions to the manual control, so I can drive the rover manually and also control the servos.


Worked perfectly! You are the man!
Looking forward to using more of your code. I am a Linux geek but a newbie on Arduino, and I am trying my best with Python :D
Thanks!
Kind regards

Stefan / SM7TIX

