Hello,
I do not know exactly where to start. First of all, thank you for the great support. I am pretty new to robotics and do not really know ROS and the like yet. Over the last year I built a great module with a web server, many sensors, bus support, and real-time IO control via HTML (bidirectional), implemented with Flask. A great thing. It includes its own camera (USB or Raspberry Pi camera port) in high resolution, with recording and photo functions. An SDR receiver with its own server is also included.
It covers a wide range of possibilities and can easily be expanded at any time. Currently IMU, temperature, air pressure, humidity, and air quality can be measured and logged (via XLS export). There are 16 PWM channels, 6 digital IOs, 3 relay ports for high current, plus one external I2C port, one 1-Wire port, and a USB port.
A lot to control, measure and regulate.
The maximum data bandwidth is 8 Mbit/s with video at 800x600 resolution. A great picture (12 mm lens); see picture 1.
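To illustrate what I mean by the bidirectional IO control via Flask mentioned above, here is a minimal sketch of the idea (not my actual module; the route, pin count, and in-memory pin table are just placeholders):

```python
# Minimal sketch of bidirectional IO control over HTTP with Flask (only an
# illustration of the idea, not my actual module). A browser reads a pin with
# GET and sets it with POST; real hardware access would replace the dictionary.
from flask import Flask, jsonify, request

app = Flask(__name__)
digital_outputs = {pin: 0 for pin in range(6)}   # placeholder for the 6 digital IOs

@app.route("/io/<int:pin>", methods=["GET", "POST"])
def digital_io(pin):
    if pin not in digital_outputs:
        return jsonify(error="unknown pin"), 404
    if request.method == "POST":
        # expects a JSON body like {"value": 1}
        digital_outputs[pin] = int(request.json["value"])
    return jsonify(pin=pin, value=digital_outputs[pin])

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```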
Let’s get to the actual problem.
By the way, I have read a lot about ROS. I was interested in experimenting with stereo cameras, SLAM for example. A great and exciting topic. First experiments with a Kinect are promising, but only with it connected directly to the laptop. Following a YouTube video I also set that up once with a Raspberry Pi 3B+ and even got the master -> client system running. The performance was so-so. Various ways of leaving the data on the Raspberry Pi and evaluating it later also worked.

Now to the Leo. First of all, the ROS master and the Web UI are running. However, I cannot establish a connection from an external ROS system. The error messages ("cannot find a master") are discussed in many forums, but only solved unsatisfactorily. So that is unfortunately out. Only a remote desktop connection to the Pi (Husarion interface) works (SSH too). You can start RViz and co. via that desktop, but with catastrophic performance. So that is not usable, and I think it is not meant to be used this way anyway; that would be illogical. In RViz it immediately stands out that some nodes are there (camera), but there is no node for driving and no 2D Nav Goal (or whatever it is called), so direct operation is missing. Maybe that is what I am doing wrong. But if there is no node... Strangely, everything works via the Web UI.
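For what it is worth, here is how I understand the remote connection is supposed to look. This is only a sketch with placeholder IP addresses (10.0.0.1 for the rover, 10.0.0.2 for the laptop) and the default master port 11311; it sets the two environment variables every ROS 1 tool reads and then asks the master for its node/topic list over plain XML-RPC:

```python
#!/usr/bin/env python3
# Minimal connectivity check for a remote ROS 1 master (a sketch, not the
# official Leo setup). IP addresses are placeholders for my network.
import os
import xmlrpc.client

ROVER_IP = "10.0.0.1"    # placeholder: address of the Leo / Raspberry Pi
LAPTOP_IP = "10.0.0.2"   # placeholder: address of the laptop running RViz

# These are the two variables every ROS 1 tool (rviz, rostopic, ...) reads.
# On the laptop they would normally be exported in ~/.bashrc:
#   export ROS_MASTER_URI=http://10.0.0.1:11311
#   export ROS_IP=10.0.0.2
os.environ["ROS_MASTER_URI"] = f"http://{ROVER_IP}:11311"
os.environ["ROS_IP"] = LAPTOP_IP

# The ROS master speaks plain XML-RPC, so we can query it for its node and
# topic list even without a ROS installation on this machine.
master = xmlrpc.client.ServerProxy(os.environ["ROS_MASTER_URI"])
try:
    code, status, state = master.getSystemState("/connectivity_check")
except OSError as err:
    raise SystemExit(f"Cannot reach the master at all: {err}")

publishers, subscribers, services = state
print("Master reachable:", status)
print("Published topics:")
for topic, nodes in publishers:
    print(f"  {topic}  <- published by {nodes}")
```

If this already fails with a connection error, the problem is presumably at the network level (port 11311 not reachable from the laptop) rather than in ROS itself.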
In the future it is about so-called mission planning, less about 3D stuff: for example, using a lidar to drive autonomously or to given coordinates, i.e. the actual ROS features. The goal is also the change to the Pololu motors with encoders.
Well, at this point I do not know how to proceed.
In short, ROS does not find the master (and the nodes?). Even via remote desktop they do not appear, as mentioned.
The IP addresses are tested, and SSH and remote desktop are working.
And there is one more thing I do not really understand yet. When driving the Leo via the Web UI, the average data bandwidth is 15 MB/s. Why so much? The live camera only runs at 640x480 pixels.
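To put my own question into numbers (a rough estimate only, assuming an MJPEG-style stream at 30 fps, which is just my guess at what the Web UI does):

```python
# Rough estimate of the camera stream bandwidth (my assumptions, not measured
# values from the Leo software): 640x480 pixels, 3 bytes per pixel, 30 fps.
width, height, bytes_per_pixel, fps = 640, 480, 3, 30

raw = width * height * bytes_per_pixel * fps        # bytes per second
print(f"Uncompressed: {raw / 1e6:.1f} MB/s")        # ~27.6 MB/s

# With MJPEG every frame is JPEG-compressed individually; compression factors
# of roughly 5-20 are typical (again, my assumption).
for ratio in (5, 10, 20):
    print(f"JPEG ~1:{ratio}:   {raw / ratio / 1e6:.2f} MB/s")
```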