Other articles


  1. What have I done?

    Intro

    During these weird times of the coronavirus crisis, like many, I am working from home and have cancelled all social activities. Apart from the distancing, the virus and the lack of activities, it has been quite a fruitful period, as I worked a lot on the robot. This article sums up the latest developments and achievements.

    New functionalities

    My robot, the Agayon, now has quite a lot more features. Among them:

    • Video streaming, nearly in real time. The stream can be viewed in a web browser. It works quite well and has been used across the internet: some friends and family members have visited me with the help of the robot. The only drawback is that I have to limit the stream to 10 frames per second. The setup is based on mjpg-streamer and the custom layout is available on its dedicated Gitlab repository.
    • The Agayon can now be remotely controlled with:
      • A PS4 controller (Bluetooth)
      • A REST API
      • A web interface that uses the API (see screenshot below)
    • Lidar mapping: it can be triggered by the PS4 controller, the web interface or XMPP. I use an RPLidar A1M8. Data is saved in a file that can be analyzed afterwards. No real-time data processing for now.
    • XMPP: migration from the deprecated SleekXMPP library to its more modern successor, Slixmpp.

    Changelogs

    Python (Raspberry Pi)

    The code is available on my Gitlab account.

    • Small turn angles for remote control. The robot was sometimes too brutal during rotations. This version adds support for 'gentle turns', during which only one wheel moves. The 'normal turn' remains and rotates both wheels in opposite directions.
    • Start and stop webcam streaming from a socket. It relies on the restart_stream.sh and stop_stream.sh scripts and is used from the web interface and XMPP.
    • Mapping: simple scan and logging of all mapping data. It saves a picture (polar graph) for each snapshot (see the sketch after this list).
    • Arduino communication: serial data to get the ultrasonic sensor measurements.
    • The rover can be remotely controlled with a PS4 controller. Events are caught and instructions are sent to the Arduino through serial communication.
    • Some refactoring and cleaning.
    • A lot of bug fixes.
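
    As an illustration of the mapping entry above, here is a minimal sketch of such a "scan, log and save a polar graph" step. It assumes the rplidar and matplotlib Python packages; the serial port and output path are hypothetical, and the actual code on Gitlab differs.

    import matplotlib
    matplotlib.use('Agg')  # headless rendering on the Pi
    import matplotlib.pyplot as plt
    import numpy as np
    from rplidar import RPLidar

    # The port is hypothetical; check dmesg for the actual device node
    lidar = RPLidar('/dev/ttyUSB0')

    # Grab a single snapshot: one full 360° scan
    scan = next(lidar.iter_scans())
    lidar.stop()
    lidar.stop_motor()
    lidar.disconnect()

    # Each measurement is (quality, angle in degrees, distance in mm)
    angles = np.radians([measurement[1] for measurement in scan])
    distances = [measurement[2] for measurement in scan]

    # Save the snapshot as a polar graph
    ax = plt.subplot(projection='polar')
    ax.scatter(angles, distances, s=2)
    plt.savefig('/tmp/lidar_snapshot.png')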

    Arduino

    The code is available on the gitlab repository.

    • I created a small serial manager to handle orders from the RPi (see the sketch after this list). It can be used to remotely control the motors, get ultrasonic measurements, change the speed, change the mode (incrementally or directly to a mode number) and turn a little (small angles, but not that small).
    • Send formatted serial data that can be easily parsed by the RPI process (odometry).
    • A lot of bug fixes
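
    To give an idea of the Pi side of this serial dialogue, here is a minimal sketch based on the pyserial package. The port, baud rate and single-letter order code are hypothetical, not the actual protocol of the r1d3 code.

    import serial

    # The port and baud rate must match the Arduino sketch; both are hypothetical here
    with serial.Serial('/dev/ttyACM0', 115200, timeout=1) as link:
        # Send a made-up order: 'U' could ask for the ultrasonic measurements
        link.write(b'U\n')
        # The Arduino answers with a formatted line that is easy to parse
        answer = link.readline().decode('ascii').strip()
        print(answer)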

    Misc

    The web remote control is based on a small web service built with the Flask framework. It is served by uWSGI behind Nginx. The sources are in the related Gitlab repository. It is used to control the robot from a web interface, but any client able to use a web API can use it. Maybe an Android client will follow?
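
    To give an idea of the shape of such a service, here is a minimal Flask sketch. The endpoint name and the accepted directions are hypothetical, not the actual API of the repository.

    from flask import Flask, jsonify

    app = Flask(__name__)

    # Hypothetical endpoint: the real routes live in the Gitlab repository
    @app.route('/api/move/<direction>', methods=['POST'])
    def move(direction):
        if direction not in ('forward', 'backward', 'left', 'right', 'stop'):
            return jsonify(error='unknown direction'), 400
        # Here the order would be forwarded to the Arduino over serial
        return jsonify(status='ok', direction=direction)

    if __name__ == '__main__':
        app.run(host='127.0.0.1', port=5000)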

    When the video streaming service is unavailable, a 503 error is displayed with a custom page.

    503 error

    I use it to launch the webcam streaming directly from the browser.

    All the HTML pages, scripts and config files are available in a separate repository.

    Once mjpg_streamer is launched, it serves a small page to interact with the API.

    This picture was taken at the beginning of a ~4m narrow hallway.

    Hallway

    Pictures

    Here are some pictures taken in my apartment. As long as no Lidar measurement has been made, a cartoonish picture of the robot is displayed. As the robot goes forward, the mapping is updated by clicking on the Lidar camera button. For an unknown reason, the left/right mapping is inverted in the picture.

    Start

    Start Lidar

    Middle

    Middle Lidar

    End

    End Lidar

    Future

    There is still a lot of room for improvement. Here are some ideas to keep me busy in the future:

    • Automatically save a camera picture when the lidar mapping is triggered.
    • Detect the kernel messages about the battery. When the voltage is too low, the information is logged by the kernel and can be found with systemd. The idea would be to shut down the robot when necessary to avoid SD card corruption (see the sketch after this list).
    • Apply some nice OpenCV filter to obtain a transformed video stream like in the movie Terminator. :-)
    • Use one button to trigger the video recording. Use a blinking LED to let people know that it is "On Air".
    • ...
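
    The battery idea could start from something as simple as the following sketch. It assumes systemd's journalctl and the usual Raspberry Pi under-voltage kernel message; the exact trigger and shutdown policy are left open.

    import subprocess

    # Read the kernel messages of the current boot with journalctl
    log = subprocess.run(['journalctl', '-k', '-b', '--no-pager'],
                         capture_output=True, text=True).stdout

    # On a Raspberry Pi, a weak supply is logged as an under-voltage event
    if 'Under-voltage detected!' in log:
        # A clean shutdown avoids SD card corruption
        subprocess.run(['sudo', 'shutdown', '-h', 'now'])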

    Recording movies with the camera

    Recording a movie with OpenCV can be done in less than 25 Python lines. It works well with my old Logitech C170 but, for some reason, it did not work out of the box with my Microsoft LifeCam Studio. Dear visitor, if you have such a camera, here is the magic trick to produce your new feature movie! First, make sure you have this model with the lsusb command:

    Bus 001 Device 007: ID 045e:0811 Microsoft Corp. Microsoft® LifeCam Studio(TM)
    

    The camera naturally produces an MJPG stream, which I would sum up as a stream of JPEG pictures. By default, the VideoWriter class will produce empty video files and, unfortunately, there are no debug messages to help you. Don't forget to use the following environment variables during your hacking sessions:

    OPENCV_VIDEOIO_DEBUG=1 
    OPENCV_LOG_LEVEL=verbose  
    

    To make it work, you need to tune the camera settings to obtain the desired result. The complete program is displayed here. Follow the comments to see where the magic happens.

    import cv2
    import os
    
    cap = cv2.VideoCapture(0)
    if not cap.isOpened():
        raise IOError("Cannot open webcam")
    
    # Magic number corresponding to an MJPG stream ('MJPG' as a little-endian integer)
    codec = 0x47504A4D
    # This tells the capture backend that the camera produces such a stream
    cap.set(cv2.CAP_PROP_FOURCC, codec)
    # We fix the resolution and the framerate and use exactly the same values
    # in the VideoWriter arguments
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
    # The framerate is not limiting here
    cap.set(cv2.CAP_PROP_FPS, 20.0)
    
    # Next we define the filename, the writer options and start the infinite loop.
    videoname = os.path.join('/tmp/', 'output.avi')
    fourcc = cv2.VideoWriter_fourcc(*'MJPG')
    video_writer = cv2.VideoWriter(videoname, fourcc, 20.0, (640, 480))
    
    while True:
        ret, frame = cap.read()
        if not ret:
            break
        video_writer.write(frame)
        # A window is needed for waitKey() to receive key presses
        cv2.imshow('recording', frame)
        # When the Esc key is pressed, the loop is stopped
        if cv2.waitKey(1) & 0xFF == 27:
            break
    
    cap.release()
    video_writer.release()
    cv2.destroyAllWindows()
    

    Webserver configuration

    Here you can find the webserver and uWSGI configurations used to make it all work.

    Nginx

    upstream mjpeg {
     server 127.0.0.1:8090;
    }
    
    server {
      listen 443 ssl;
      server_name namek.agayon.netlib.re;
      location / {
        proxy_redirect off;
        proxy_pass http://mjpeg;
        index index.html;
      }
      location /api {
        include uwsgi_params;
        uwsgi_pass 127.0.0.1:3031;
        proxy_read_timeout 300s;
      }
    
    
      location /static/ {
        alias /srv/http/ngnix/r1d3/public_html/static/;
        try_files $uri $uri/ /static/lost_bot.jpg;
      }
    
      ssl_certificate     /etc/letsencrypt/agayon.netlib.re_fullchain.pem;
      ssl_certificate_key /etc/letsencrypt/agayon.netlib.re.key;
      ssl_session_timeout 1d;
      access_log /var/log/nginx/r1d3.access.log;
      error_log /var/log/nginx/r1d3.error.log;
      error_page 500 502 503 504 /500.html;
      location = /500.html {
        root   /srv/http/ngnix/r1d3/errors;
        allow all;
        internal;
      }
      proxy_read_timeout 720s;
      proxy_connect_timeout 720s;
      proxy_send_timeout 720s;
      proxy_set_header X-Forwarded-Host $host;
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header X-Forwarded-Proto $scheme;
      proxy_set_header X-Real-IP $remote_addr;
    }
    

    uWSGI

    [uwsgi]
    chdir = /srv/http/ngnix/api/rest_api/
    
    processes = 4
    threads = 2
    plugin = python
    virtualenv = /srv/http/ngnix/api/rest_api/venv
    
    module = api_agayon.agayon_app:app
    callable = app
    wsgi-file = wsgi.py
    master = true
    socket = 127.0.0.1:3031
    
    ;route-uri = ^/api/(.*) rewrite:/$1
    vacuum = true
    
    die-on-term = true
    kill-on-idle = true
    
    stats = 127.0.0.1:9191
    buffer-size=32768
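
    For completeness, the wsgi.py entry point referenced above can stay very small. This is a minimal sketch assuming the api_agayon package layout implied by the module line; the real file lives in the API repository.

    # wsgi.py: minimal uWSGI entry point
    from api_agayon.agayon_app import app

    if __name__ == '__main__':
        # Only for local debugging; in production uWSGI imports `app` itself
        app.run(host='127.0.0.1', port=5000)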
    

    Continue reading
  2. A remotely controlled mapping device

    Published: Sun 05 January 2020
    Updated: Sun 05 January 2020 By Arnaud In Agayon. tags: agayon

    During the Christmas holidays, I took the time to work on the Agayon. The mechanical parts are almost finished, and I hope to be able to focus on the code in the following weeks/months.

    Do as you’re told

    A few years ago, I was gifted a PS4 controller to play on my Retropie setup. These are quality controllers, but Sony does not like makers: my model cannot be used with the vanilla Bluetooth drivers on my Pi and, for unknown reasons, the alternative driver ds4drv does not work for me. I purchased a 5 m mini-USB cable so as not to disappoint my lovely niece and nephew; they can play SuperTuxKart for hours. But that is not the subject of this article. I plan to use the Agayon not only in my apartment but also to go on tour with it! (let's dream a bit). Unfortunately, it is already quite heavy and it would be cool to drive it from my door to the car. I could dream again and imagine a system where it recognizes me after a little bit of training, like a dog or a cat. It seems possible, but why not use the controller first? It should be easier and quicker to implement. Moreover, the kids and friends seem to love the idea. So let's do it! I will keep you updated when it is reliable and easy to use.

    Back to the Future remote. Photo credit: Back to the Future (1985)

    Re-verify our range to target... one ping only

    I have finally mounted the Lidar on the Agayon. The Raspberry Pi 2 was too slow to handle the data, but the Pi 4 does the job well.

    Mounted lidar

    My living room

    The following animation has been made with the animate.py script from the RPLIDAR repository.

    Lidar animation (polar coordinates)

    Streaming

    Last year I discovered the RazTot project and it inspired me. According to the readme, the RazTot is an easy DIY project which allows you to securely control a roving security camera from your browser. After some tests, I decided not to use this project because its Flask server and interface would not integrate so nicely with my R1D3 code base. For now, I only use Janus, a general purpose WebRTC server, to stream from the robot to a generic web page. Unfortunately, my Microsoft LifeCam Studio does not produce a stream compatible with Janus: I need to transcode the MPEG-4 video stream to H.264 with ffmpeg to see it in a browser. I hope to be able to release the code in the following weeks.
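
    As an illustration, the transcoding step can be wrapped in a few lines of Python. This is a rough sketch under assumptions: the device node, resolution and RTP port are hypothetical and depend on the Janus streaming mountpoint configuration.

    import subprocess

    # Hypothetical values: adjust the device, size and port to the Janus mountpoint
    cmd = [
        'ffmpeg',
        '-f', 'v4l2', '-video_size', '640x480', '-i', '/dev/video0',
        # Transcode to H.264 with settings suited to live streaming
        '-c:v', 'libx264', '-preset', 'ultrafast', '-tune', 'zerolatency',
        '-pix_fmt', 'yuv420p',
        # Send the result as RTP, which the Janus streaming plugin can relay
        '-f', 'rtp', 'rtp://127.0.0.1:8004',
    ]
    subprocess.run(cmd, check=True)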

    More to come, stay tuned!

    Continue reading
  3. The rise of the machine

    Hurrah!

    After months of work on the Agayon, I can present some significant improvements! This article is a little bit longer than the previous ones, but it is worth the read!

    Software updates!

    During the past few weeks, the code base of the Agayon has been updated. I forked my own project, r1d2, to update it. The new repository is named r1d3. I hesitated a long time before forking it, but as the hardware base of the Agayon completely changed, I preferred to change the code name to maintain coherence between hardware and software.

    The update aims to provide

    • Python 3 support only
    • OpenCV 4 support for
      • Face recognition
      • Sign tracking
      • Face/hand detection and tracking
    • Better XMPP ad hoc support
    • I2C support
    • Hardware switch support.

    XMPP Ad hoc commands

    As I made tests with SleekXMPP to control the bot, I observed some problems with Gajim. The Ad-Hoc Commands extension allows one to send commands to an XMPP bot. R1D3 displays the following menus and submenus (in French):

    menu1 menu2 menu3

    When I try to use the "execute" button, SleekXMPP starts a new session and Gajim complains that the session identifier has changed. I reported the problem to SleekXMPP and its fork Slixmpp. The XMPP community is great and Maxime Buquet responded quickly. To quote him, there are two problems (see the bug report for the full explanation):

    • Slixmpp shouldn't assume execute is the start of a command
    • I don't see a place in the XEP that says that next or execute can be equivalent to complete. What to do?

    He sent an email to the "Standards" mailing list and some responses followed. It seems difficult to fix the protocol at the moment without breaking compatibility. Maxime proposed a patch to fix Slixmpp, and it should also work on SleekXMPP. For now, I just don't use the "Execute" button, as "Forward" does the job. The deprecation of the "Execute" action is currently being discussed.
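
    For readers curious about what such a bot looks like, here is a minimal Slixmpp sketch registering a single Ad-Hoc command. The node name, form field and status value are made up for the example; the real R1D3 menus are more elaborate.

    import slixmpp

    class R1D3Bot(slixmpp.ClientXMPP):

        def __init__(self, jid, password):
            super().__init__(jid, password)
            self.register_plugin('xep_0004')  # Data Forms
            self.register_plugin('xep_0050')  # Ad-Hoc Commands
            self.add_event_handler('session_start', self.start)

        async def start(self, event):
            self.send_presence()
            await self.get_roster()
            # Register a command node; clients like Gajim list it in their menus
            self['xep_0050'].add_command(node='status',
                                         name='Robot status',
                                         handler=self.handle_status)

        def handle_status(self, iq, session):
            # Single-step command: answer with a result form and end the session
            form = self['xep_0004'].make_form(ftype='result', title='Status')
            form.add_field(var='state', label='State', value='idle')
            session['payload'] = form
            session['next'] = None
            session['has_next'] = False
            return session

    if __name__ == '__main__':
        bot = R1D3Bot('r1d3@example.org', 'secret')
        bot.connect()
        bot.process()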

    New hardware!

    The Agayon now has 8 LEDs and 6 switches. They are placed on a control panel.

    The LEDs aim to provide status information

    • 5V power (orange)
    • Pi powered up (green)
    • I2C on the Arduino (green)
    • I2C + serial on the Pi (green)
    • Serial communication (green, Arduino)
    • Video capture (red)
    • Internet connection (blue)
    • LIDAR mapping (red)

    The switches aim to provide

    • Power on (12V) (Ebay)
    • Start R1D3 (MCHobby)
    • On/Off Demo mode (Arduino) (Ebay)
    • On/Off Power down (Pi) (Ebay)
    • Emergency stop (cut Arduino power) (Ebay)
    • Movie recording (Pi)
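
    On the Pi side, reading such a switch and driving a status LED takes only a few lines with gpiozero. This is a generic sketch: the BCM pin numbers and the blinking policy are hypothetical, not the actual wiring of the control panel.

    from signal import pause
    from gpiozero import Button, LED

    # Hypothetical BCM pins; the real wiring differs
    record_button = Button(17)  # movie recording switch
    capture_led = LED(27)       # video capture LED (red)

    def start_recording():
        capture_led.blink(on_time=0.5, off_time=0.5)  # show that we are "On Air"

    def stop_recording():
        capture_led.off()

    record_button.when_pressed = start_recording
    record_button.when_released = stop_recording

    pause()  # keep the script alive, waiting for GPIO events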

    In addition, the following hardware is also mounted to provide information and input/output. I2C addresses are displayed (0xXX).

    The documentation of the SHT71 explains why the sensor has no I2C address.

    The serial interface of the SHT7x is optimized for sensor readout and effective power consumption. The sensor cannot be addressed by I2C protocol, however, the sensor can be connected to an I2C bus without interference with other devices connected to the bus. Microcontroller must switch between protocols.

    One ground to rule them all

    I have been advised to use an epoxy base coated with a copper layer. The aim is to connect it to the negative pole of the battery. It is really useful because it decreases the wiring. The perfboards are fixed on metallic spacer bars to avoid short circuits.

    plate1

    I2C Scans

    I²C is a communication bus that allows multiple devices to communicate with each other.

    I2C devices are recognized by the Arduino (5V) and the Raspberry Pi (3.3V) with the help of a level shifter.

    I've used the I2C scanner provided by the Arduino documentation.

    Arduino

    Scanning...
    I2C device found at address 0x1D  !
    I2C device found at address 0x20  !
    I2C device found at address 0x6B  !
    I2C device found at address 0x70  !
    done
    

    Raspberry PI

    user$ i2cdetect -y 1
         0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
    00:          -- -- -- -- -- -- -- -- -- -- -- -- --
    10: -- -- -- -- -- -- -- -- -- -- -- -- -- 1d -- --
    20: 20 -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    60: -- -- -- -- -- -- -- -- -- -- -- 6b -- -- -- --
    70: 70 -- -- -- -- -- -- --
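
    The same scan can be done in a few lines of Python, which is handy inside the robot's own code. This sketch assumes the smbus2 package; bus number 1 matches the i2cdetect call above.

    from smbus2 import SMBus

    # Probe the usable 7-bit address range, like `i2cdetect -y 1`
    with SMBus(1) as bus:
        for address in range(0x03, 0x78):
            try:
                bus.read_byte(address)  # a present device ACKs the read
                print(f"I2C device found at address 0x{address:02X}")
            except OSError:
                pass  # no device at this address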
    

    Gallery

    "Scaffolding" with hot glue

    During the past few months, my best friend has been my hot glue gun. I was skeptical at first, but it is really effective and fun. I used it to insulate some connectors. In Liège, we would say "mettre une noquette de colle", which translates to "put a knob of glue".

    glue1 glue3

    capot2 capot3

    Evolution of the frame

    full_base connection_base_capot

    complete_1

    complete_3 complete_4

    • A : Battery
    • B : Level shifter between Arduino (5V) and Raspberry Pi (3.3V)
    • C : Arduino Mega
    • D : Power lines and I2C (12V, 5 V, 3.3V, SDA 5V, SCL 5V, SDA 3.3V, SCL 3.3V)
    • E : Raspberry Pi (in its case)
    • F : Buttons and their pull down (3.3V or 5V depending on the GPIO)
    • G : LEDs

    New (old) Oscilloscope

    One of my colleagues has been cleaning his lab, and he asked me if I was interested in an old 20 MHz oscilloscope. I gladly accepted. It is a 34-year-old Circuitmate 9020 (bought in 1985). I will use it for I2C debugging and visualization.

    oscillo

    Conclusion

    The hardware is almost done. I am happy to have a nice reliable base. I hope to be able to drive it with my smartphone soon. I will continue the programming to add the mapping functionality and a nice demo mode.

    Stay tuned!

    smile

    Continue reading
