First Commit

This commit is contained in:
2024-12-02 15:11:30 +01:00
commit 031f6004de
4688 changed files with 441558 additions and 0 deletions

View File

@@ -0,0 +1,12 @@
![[Pasted image 20240130103300.png]]
> btw, I suspect that the coordconv at the first layer might not be necessary (coord0)
> at the first layer the CNN just does low-level feature detection (detecting edges and so on)
> the coordconvs are only helpful for higher-level analysis deeper in the model
> at least that's my theory
> also, do I see correctly that after the first layer, there's 1 maxpool that makes it 4x smaller? from 2048x1024 to 512x256?
> or does the first layer use stride 2, and the maxpool only makes it 2x smaller?
> overall the architecture looks good :+1:
> - Comments [[Floris]]

View File

@@ -0,0 +1,13 @@
A loom video instruction can be found [here](https://www.loom.com/share/f6460a1200fe499eb3aae941c8bd8ca8?sid=49451325-5d3a-4953-8e24-7007af44a99e).
# Preparation
1. Decide which image you want to flash to the cm4. All images can be found on the [google drive](https://drive.google.com/drive/folders/1SCE0VCfHIKdJlZBVMUI5AOd8uqdrAA_7) and usually you can find an overview on the [CM4 Images Overview sheet](https://docs.google.com/spreadsheets/d/1hWfCUwhnYKjQz526QUwpVui-eFWYgPfyoRJ40pwvJD8/edit#gid=0). For this example I chose the image `cm16_240104.img.xz`.
2. Decide which physical cm4 board you want to flash the image to. A list of all cm4s that we own can be found in the [CM4 overview sheet](https://docs.google.com/spreadsheets/d/18PtsqKYUrfK0cAAFBfwtgTSo0JnJ0n5Dc3oIjpWry6I/edit#gid=0). For this example we chose the cm4 with ID 14.
1. Be aware that there are two versions of cm4s. The lite version requires an SD-card whereas the EMMC version does not. The flashing instructions are different for the two.
2. The cm4s also come with different RAM and EMMC storage. You can visually inspect the part numbers if you are unsure. E.g. [RAM](https://semiconductor.samsung.com/dram/lpddr/lpddr4/k4f6e304hb-mgcj/), or [EMMC](https://semiconductor.samsung.com/estorage/emmc/emmc-5-1/klmbg2jetd-b041/).
# Flashing
If you have the EMMC version follow the instructions here: https://onesecinc.atlassian.net/wiki/spaces/MC1/pages/35029029/Flash+CM4+EMMC+with+a+new+Image
If you flash a Lite version, plug the SD-card into your development computer and then follow the instructions starting from point 5.
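For the Lite/SD-card path, writing the image can be sketched as follows (a sketch: `/dev/sdX` is a placeholder, double-check the device node with `lsblk` first, since writing to the wrong disk is destructive):

```bash
# identify the SD-card device node
lsblk
# decompress the image on the fly and write it to the SD-card
xz -dc cm16_240104.img.xz | sudo dd of=/dev/sdX bs=4M status=progress conv=fsync
```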
# UID
# Testing

View File

@@ -0,0 +1,5 @@
# Troubleshooting
## Really slow internet
I noticed that the [[git]] commands `git fetch` and `git push` were extremely slow (several minutes), and it was because of a slow internet connection: the ping to google.com was around 64000 ms. The problem in the end was that the cm4 was using the `wwan0` interface (which is our LTE connection) but the antenna was not connected. I was surprised that it even found a network at all.
The solution was to either connect the antenna, or to take the network down with
`sudo ip link set wwan0 down`.
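A quick way to confirm which interface is carrying the traffic (a diagnostic sketch; `eth0` is an assumed name for the wired interface, adjust to your system):

```bash
# show which interface holds the default route
ip route show default
# compare latency per interface (wwan0 is our LTE modem)
ping -c 3 -I wwan0 google.com
ping -c 3 -I eth0 google.com
```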

View File

@@ -0,0 +1,80 @@
In order to finish my job at the end of April, I need to document all the different projects I have worked on and make sure that the relevant information can be accessed without my input.
# Tooling
## OneSecServer
### Foxglove Studio on Server
### Flight Log on Server
## Teststand
## Wind Analysis
## Project Management on git
## 3D Printing
### FDM
### SLA
### SLS
## CNC Machining
# Software Development
## OSD-Autopilot
### Foxglove Bridge / ROSBridge
### GNSS Driver
### LTE Driver
### Bridge
#### Build Problems
#### RTPS vs DDS
### Hooklanding
### NAV2 & Behavior Trees
### Table Follower
## PX4-Autopilot
### Nuttshell and how to develop and debug
see
- [[Nuttshell - NSH]]
- [[Custom uORB Message]]
- [[PX4 Build Instructions]]
### V1.14 and Control Allocation
### OSD Payload Controller
### Magnetic Sensor Drivers
#### Magnetic Encoder Baseclass
#### ASR012
#### AS5047P
### INDI-Controller
### Gazebo Simulation
#### Virtual Swashplate Model
## Automated Testing
## Ubuntu Image
### How to install ROS2
### How to install Arducam Drivers
### How to install Hailo Drivers
# Hardware Development
## Drone Assembly Diagram
## Virtual Swashplate Development
## PCB Development
### Development Process
#### Automated Drive Documents
# Field Testing
## Drone Assembly
## Log Analysis
### OSD Flight Review
### Plotjuggler
## OSD-Autopilot Test Procedure
### How to Run Table Follower
### How to Run the Hooklanding
# Future Planning
A brief overview of things I think are important for the next year or so (similar to the large software list I shared in December).

View File

@@ -0,0 +1,6 @@
# Unchanged from V2
- temp and humidity sensor
- 3V3 LDO:
- only the consumers changed, but the load remains below the limit
- led drivers changed only very little

View File

@@ -0,0 +1,8 @@
[This article](https://opensource.com/article/18/6/embedded-linux-build-tools) summarizes different build tools for getting an embedded application working. Currently, we use the last of the four options described: a stripped-down desktop distribution.
Also, [this video](https://www.youtube.com/watch?v=9vsu67uMcko&ab_channel=DigiKey) from the DigiKey series goes into those tools in more detail, mostly Yocto.
# Yocto
# Buildroot
# OpenWRT / LEDE

View File

@@ -0,0 +1,3 @@
- [ ] Is SIM-card inserted and correctly oriented?
- [ ] Are the antennas connected securely (GNSS, LTE, LTE diversity)?
- [ ]

View File

@@ -0,0 +1,4 @@
# Assembly
- [ ] Antenna cables need to be fixed to the same frame as the carrierboard, or the cables should be long enough that they are never under tension and cannot get ripped out
- [ ] GNSS and LTE should just work without issues, and if there is a problem there should be an indicator like an LED or a sound (e.g. through the motors and px4)
- [ ]

View File

@@ -0,0 +1,5 @@
The Webapp is hosted on our [[OneSec Server#PX4 - Flight Review|VPS server]]. The credentials are:
- user: claudio
- pw: CesOne8143_
# Add a New Log
![[Pasted image 20240320120530.png]]

View File

@@ -0,0 +1,64 @@
We host the server at [netcup - Germany](https://www.netcup.eu/), a company that offers VPS. The IP address of our server is 37.120.177.0.
# Organisation
## Access
The following is the SSH access configuration for the server (part of my `~/.ssh/config` file):
```bash
Host netcup
HostName v2202204173997187481.hotsrv.de
User root
IdentityFile ~/.ssh/netcup_osd_claudio
```
## Setup
Vinnie, Dannick's friend, helped me with the setup. I leave the bash history and the zsh history files here as a reference.
![[bash_history_netcup]]
![[zsh_history_netcup]]
I have installed `zsh` and `oh-my-zsh` to make the interaction easier.
In the CLI history files above you can see how the project got updated. In short: update the project files on a local machine, test everything, and push them to GitLab. Then access the server, pull the changes, check out the correct version and restart the docker container:
```zsh
docker compose -f docker-compose.dev.yml down
docker compose -f docker-compose.dev.yml up -d --build --force-recreate
```
## Structure
The main folder is called `osd_apps`; it hosts the web-apps that we're running.
We use [[nginx]] as a reverse proxy to serve the different web-apps and [[Docker]] to run the apps in an appropriate container.
We also use certbot (installed through snap) to keep the certificates up to date.
The [[OneSec - FlightReview|flight review app]] is hosted at [testlogs.onesec.com](https://testlogs.onesec.com). The authentication is done through an nginx configuration. The credentials can be found in the linked article.
### 6tunnel
[6Tunnel](https://github.com/wojtekka/6tunnel) is used to create a translation service for Dannick's NAS. Basically, the server (which has a static IPv4 address) forwards requests to an IPv6 address (Dannick's NAS). See the history files above for the commands, and make sure that the firewall (`ufw`) allows the correct ports (32400 in Dannick's case).
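A minimal sketch of such a tunnel (the hostname is a placeholder; by default 6tunnel listens on IPv4 and forwards to an IPv6 host):

```bash
# listen on IPv4 port 32400 and forward everything to the NAS's IPv6 address
6tunnel 32400 nas.example-ipv6-host.net 32400
# open the port in the firewall
sudo ufw allow 32400
```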
### Nginx
Nginx is used as a reverse proxy server that checks incoming traffic and routes it to the correct interface on the server. We can configure it such that any traffic arriving for a certain URI (e.g. testlogs.onesec.com) is rerouted to localhost:5006. On this interface we are running a docker container, which hosts the web-app. This means that any traffic arriving via testlogs.onesec.com is directed towards the web-app, and any other traffic (arriving via the bare IP, for instance) is routed to `/var/www/http`, where static content is served (default nginx behaviour). If another app is added, we can use an additional port for it as explained in [this article](https://www.digitalocean.com/community/questions/how-to-host-multiple-docker-containers-on-a-single-droplet-with-nginx-reverse-proxy).
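A minimal sketch of such a server block (hostname and port taken from the description above; the actual config on the server may differ):

```nginx
server {
    server_name testlogs.onesec.com;

    location / {
        # forward matching traffic to the docker container on port 5006
        proxy_pass http://localhost:5006;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```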
### Certbot
Certbot is a tool that manages certificates for our webserver. It has a really good integration with nginx. Some more info can be found [here](https://sirfitz.medium.com/setting-up-an-nginx-instance-with-certbot-and-configuring-it-for-wildcard-subdomains-on-ubuntu-96e413281a99). Especially the cron job for the auto-renewal is handy, but certbot also has an automated way of handling this, as can be seen in the [instruction set by Digital Ocean](https://www.digitalocean.com/community/tutorials/how-to-secure-nginx-with-let-s-encrypt-on-ubuntu-20-04). Compared to the instructions in the blog article, on our server we've used the following command:
```bash
sudo certbot --nginx -d testlogs.onesec.com
```
### Docker
[[Docker]] is used to isolate multiple web-apps that are running. A few important things are listed here:
- use the `-d` or `--detach` option to let the container run in the background even if the terminal is disconnected. This ensures that the webapp keeps running after the setup process is finished.
# WebApps
## PX4 - Flight Review
A [customized version](https://gitlab.com/onesecdelivery/flight_review_osd) of the official PX4 [flight review](https://github.com/PX4/flight_review) app is hosted there.
# How to add a new WebApp?
1. Go to namecheap.com, log in and [add a new A-record.](https://www.namecheap.com/support/knowledgebase/article.aspx/319/2237/how-can-i-set-up-an-a-address-record-for-my-domain/)![[Pasted image 20240123183014.png]]Here you can see that for the subdomains foxglove and testlogs we have an A-record [^1] forwarding to our VPS server (37.120.177.0). Once this is done, all traffic gets forwarded to our VPS server, where we need to handle it.
2. Access the server's command line through SSH (see instructions above).
3.
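To verify that a new A-record has propagated (a diagnostic sketch):

```bash
# should print the VPS address (37.120.177.0) once DNS has propagated
dig +short testlogs.onesec.com A
```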
# Footnotes
[^1]: An A record **maps a domain name to the IP address (Version 4) of the computer hosting the domain**. An A record uses a domain name to find the IP address of a computer connected to the internet. The A in A record stands for Address.

View File

@@ -0,0 +1,15 @@
Since the entire autopilot is heavily inspired by the [[ROS2 - NAV2 Library]], we also use [[Behaviour Trees]] to implement the autopilot.
All of our simple behaviors are located in the nav3_behavior_tree package under the plugins folder. There are [four types](https://www.behaviortree.dev/docs/learn-the-basics/main_concepts#create-custom-nodes-with-inheritance) of plugins that form the basic building blocks of behavior trees: action, condition, control and decorator.
1. Decide what kind of node you will be adding: we're assuming a control node for the sake of the example.
2. Write the header file in *nav3_behavior_tree/include/nav3_behavior_tree/plugins/control*.
	1. Inherit from *BT::ControlNode* (see *round_robin_node.hpp* as an example).
3. Write the cpp-file in *nav3_behavior_tree/plugins/control/round_robin_node.cpp*.
	1. Make sure you use the `BT_REGISTER_NODES()` macro to register the node in the behavior tree factory.
4. Add the plugin to the *CMakeLists.txt: nav3_behavior_tree/CMakeLists.txt*.
5. Write a test in the following file: *nav3_behavior_tree/test/plugins/control/test_round_robin_node.cpp*.
6. Register the test in the following file: *nav3_behavior_tree/test/plugins/control/CMakeLists.txt*.
	1. Make sure you test all cases that the node uses.
7. Add the new tree node to the library file: *nav3_behavior_tree/nav3_tree_nodes.xml*.
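The skeleton of such a control-node plugin might look like this (a sketch based on the BehaviorTree.CPP v3 API; the class name and file name are placeholders, see *round_robin_node.cpp* for the real pattern):

```cpp
// nav3_behavior_tree/plugins/control/example_control_node.cpp (hypothetical name)
#include "behaviortree_cpp_v3/control_node.h"
#include "behaviortree_cpp_v3/bt_factory.h"

class ExampleControlNode : public BT::ControlNode
{
public:
  explicit ExampleControlNode(const std::string & name)
  : BT::ControlNode(name, {}) {}

  BT::NodeStatus tick() override
  {
    // tick the children in order; stop at the first one that is not SUCCESS
    for (std::size_t i = 0; i < children_nodes_.size(); ++i) {
      const BT::NodeStatus status = children_nodes_[i]->executeTick();
      if (status != BT::NodeStatus::SUCCESS) {
        return status;
      }
    }
    haltChildren();
    return BT::NodeStatus::SUCCESS;
  }
};

// registers the node type with the behavior tree factory
BT_REGISTER_NODES(factory)
{
  factory.registerNodeType<ExampleControlNode>("ExampleControlNode");
}
```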

View File

@@ -0,0 +1,19 @@
Nav3 is our custom implementation of a navigation library. It is heavily inspired by [[ROS2 - NAV2 Library|NAV2]] which is a standard [[ROS2]] library.
By heavily inspired I mean it is basically copied and modified.
# Differences
## Costmap_3d
The `nav3_costmap_3d` package is completely rewritten. The only similarity to the original `nav2_costmap_2d` package is that we're using the same API to remain compatible with the rest of the `nav2` library.
Our own costmap_3d implementation uses a fruxel grid instead of voxels.
## Planners
Several planner plugins are missing for now.
## Controllers
Several controller plugins are missing for now.
## Nav3_util
- Added `calculate_linear_speed(Twist)` which is a central util function to calculate speed

View File

@@ -0,0 +1,12 @@
The purpose of using [[Docker]] here is to have a containerized build of the autopilot. This means that any employee can easily get started with development by cloning the vscode-docker-dev repo and opening it in container mode. This drastically reduces onboarding time and also ensures that everyone is working on the exact same system.
There are a few guiding ideas for how I want to set this up:
1. Keep in mind that the final code needs to run on the raspberry pi cm4 module, without a docker container, because we need hardware access.
2. The containerization can be built in several layers:
	1. ros2 production (no simulation tools; this is the closest we get to the raspberry pi --> exports a shell script to install on the raspberry pi)
	2. simulation stack: includes px4-osd that runs the current px4-code and the px4 simulation running in gazebo, as well as ros2 to do development
	3. testing stack: same as the previous stack but includes integration tests: missions to accomplish, with automated reports on whether the flights were successful or not (could also upload to a log website)
This is going to be really helpful in the future, because with every merge to the main branch, unit tests, functional tests and integration tests need to pass before the merge can happen.
PX4 already does [extensive testing](https://github.com/PX4/PX4-Autopilot/tree/main/test) within a gazebo simulation. It also includes more elaborate [integration tests](https://github.com/PX4/PX4-Autopilot/tree/main/integrationtests/python_src/px4_it).
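The layering above could be expressed as a multi-stage Dockerfile (a sketch; the base image tag and the installed packages are assumptions):

```dockerfile
# layer 1: production - ROS2 only, closest to what runs on the cm4
FROM ros:humble-ros-base AS production
RUN apt-get update && apt-get install -y python3-colcon-common-extensions

# layer 2: simulation - adds gazebo and the px4 simulation toolchain on top
FROM production AS simulation
RUN apt-get update && apt-get install -y gazebo

# layer 3: testing - adds the integration-test missions on top of simulation
FROM simulation AS testing
COPY test/ /opt/tests/
```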

View File

@@ -0,0 +1,5 @@
## How to run
1. We need a transform between map and odom: `ros2 run tf2_ros static_transform_publisher 0 0 0 0 0 0 1 map odom`
2. We need the px4 simulation running: `source ~/ros2_ws/install/setup.zsh && cd ~/px4-osd && make px4_sitl_default gazebo_osd_mono_copter`
3. We need the RTPS bridge running: `/home/claudio/ros2_ws/install/px4_ros_com/bin/micrortps_agent -t UDP`
4.

View File

@@ -0,0 +1,23 @@
# PX4_MSGS
The [PX4_msgs](https://github.com/PX4/px4_msgs) package is required, because it translates [[Custom uORB Message|uORB]] messages into the ROS2 ecosystem and vice versa. The project contains a `msg` folder, which contains the msg definitions. Usually, those message definitions are autogenerated by [a script](https://github.com/PX4/PX4-Autopilot/blob/v1.13.3/msg/tools/uorb_to_ros_msgs.py) provided by the PX4-Autopilot project.
A build error that has popped up several times is that the message definition comments contain `*/`, which is misinterpreted by the compiler.
Error message:
``` bash
/home/pi/ros2_ws/build/px4_msgs/rosidl_generator_c/px4_msgs/msg/detail/gimbal_v1_command__struct.h:356:3: error: expected identifier or ( before / token
356 | */
| ^
/home/pi/ros2_ws/build/px4_msgs/rosidl_generator_c/px4_msgs/msg/detail/gimbal_v1_command__struct.h:365:3: error: expected identifier or ( before / token
365 | */
```
In order to solve this error, just delete the relevant parts of the comment.
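To find the offending comments quickly (a sketch; the path depends on your workspace layout):

```bash
# list every message definition that contains the problematic "*/" sequence
grep -rn '\*/' ~/ros2_ws/src/px4_msgs/msg/
```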
# px4_ros_com
In order to successfully build the `px4_ros_com` package you need to define the following env-variable: `export FASTRTPSGEN_DIR="/usr/local/bin/"`.
Without it, the build will fail with the following error:
```bash
c++: error: /home/pi/ros2_ws/build/px4_ros_com/src/micrortps_agent/microRTPS_agent.cpp: No such file or directory
c++: fatal error: no input files
```

View File

@@ -0,0 +1,36 @@
# Hardware
## Pinout
![[Pasted image 20240229144513.png]]
![[Pasted image 20240229144518.png]]
# Software
1. Compile the correct firmware and flash it to the flight controller
2. Set up the correct airframe and calibrate the controller
3. Import the last px4-fmuv5 config files and check the correct settings
4. Check every peripheral independently:
1. Optical flow
2. pwm outputs (virt swash and servos)
3. rc
4. i2c --> angle sensor
5. telem 1 --> mavlink router
6. telem 2 --> micrortps bridge
7. USB --> flashing, debugging, etc.
## Virtual Swashplate Calibration
Requirements to fulfill before starting the calibration routine:
- Make sure that the motors turn in the correct direction; otherwise swap two of the ESC-motor cables
- Do this calibration routine with a battery and do not use a power supply. The power supply would limit the current and thus interfere with the swashplate modulation, falsifying all results.
- Wear a bike helmet for protection if you do the manual calibration
1. Set the PARAM1 Tuning channel to a nice continuous potentiometer type knob on your RC: ![[Pasted image 20240304172822.png]]
2. Prepare Parameters to do the calibration routine:
1. Set all `MC_*` parameters to have feedforward-only control: force-save PID values of 0 for all axes (roll, pitch and yaw) and set the FF parameter to 0.15: ![[Pasted image 20240304173415.png]]
2. Set the drone to ACRO mode
3. Calibrate every virtual swashplate individually (upper and lower; explained below for the upper swashplate with ID=0). Repeat with `ASR012_VS_EN` set to 1 (upper), 2 (lower) and 3 (both).
1. Enable only the upper swashplate under Q->ASR012->ASR012_VS_EN=1
2. Apply the RC_TO_PARAM option for the Upper Swash Offset parameter (ASR012_0_OFFSET)![[Pasted image 20240304172646.png]]
3. Wear the bike helmet
4. Hold drone in your hand and pitch forward. Turn the knob until the pitch/roll directions you feel correspond to what you command with the stick.
5. Reset RC-To-Param: Tools->Clear All RC to Param

View File

@@ -0,0 +1,43 @@
uORB messages are the equivalent of [[ROS2]] messages in the PX4 messaging system. uORB does the same job as the ROS2 middleware by sharing messages in a publish-and-subscribe system across different modules. For a good overview of how uORB works and how to customize it, look at the dev series on [this blog](https://px4.io/px4-uorb-explained-part-1/).
In order to add a custom uORB message that can be used to share information between modules and to log data, we need to do the following steps:
- add the message definition to `PX4-Autopilot/msg/custom_msg.msg`
- add the timestamp field in the first line, since it is mandatory:
`uint64 timestamp`
- add the message file path to `PX4-Autopilot/msg/CMakeLists.txt` in order to compile the message.
If the message should be used in several independent topics we can add the following line to the message definition:
`# TOPICS topic_name_1 topic_name_2 topic_name_3`
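Put together, a minimal message definition could look like this (a sketch: the payload field and the topic names are hypothetical):

```
# PX4-Autopilot/msg/custom_msg.msg
uint64 timestamp    # time since system start (microseconds), mandatory first field
float32 my_value    # hypothetical payload field

# TOPICS custom_msg custom_msg_secondary
```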
# ROS2 Compatibility
Starting from PX4 v1.14 ROS2 compatibility is native (see [this release article](https://px4.io/px4-autopilot-release-v1-14-what-you-need-to-know)).
In order to use the new uORB message with the ROS2 system on the companion computer you need to enable it on the bridge. There are 4 steps required, as explained in [this thread](https://github.com/PX4/px4_msgs/issues/7):
1. create the uORB `.msg` file under `PX4-Autopilot/msg/`
2. Generate the ROS `.msg` using the `uorb_to_ros_msgs.py` script and copy it to `px4_msgs` under your colcon/ament workspace
3. Do the changes on both `yaml` files in `PX4-Autopilot/msg/tools` and `px4_ros_com/templates` as you did above
4. Build your colcon/ament workspace with both updated `px4_ros_com` and `px4_msgs`
## Bridge Message Yaml File
The `.yaml` file exists in two locations, under `PX4-Autopilot/msg/tools/` and in `px4_ros_com/templates/`, and both copies need to be the same version for the bridge to work seamlessly. Below you can find a part of such a file ([px4_ros_com](https://github.com/PX4/px4_ros_com/blob/release/1.13/templates/urtps_bridge_topics.yaml) version):
```yaml
- msg: TelemetryStatus #9
  receive: true
- msg: Timesync #10
  receive: true
  send: true
- msg: TrajectoryWaypoint #11
  send: true
- msg: VehicleCommand #12
  receive: true
- msg: VehicleControlMode #13
  send: true
- msg: VehicleLocalPositionSetpoint #14
  receive: true
- base: VehicleLocalPositionSetpoint #15
  msg: TrajectorySetpoint
  receive: true
```
The option `receive: true` means that the bridge shares the messages from the companion computer to the PX4-flight controller, whereas the option `send: true` means that the PX4-flight controller sends the messages to the companion computer. If a two-way communication is desired you can add both options, just like in the example above (`Timesync`).
The PX4-Autopilot version uses the uORB message names (lowercase and underscore), whereas the px4_ros_com version uses the default ROS2 standard for the message definition (CamelCase).
It is **extremely** important that the yaml file definitions of those messages match, and specifically also the order of the messages, because in the background the messages are shared over their ID (indicated as a comment in the excerpt above). If the two files don't match, the bridge won't work properly.
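A quick way to confirm that the two copies match before building (a sketch; run from the directory containing both checkouts, paths as above):

```bash
diff PX4-Autopilot/msg/tools/urtps_bridge_topics.yaml \
     px4_ros_com/templates/urtps_bridge_topics.yaml && echo "bridge topic files match"
```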

View File

@@ -0,0 +1,7 @@
We want to do this in order to control a peripheral actuator such as a winch or a payload bay.
# PX4 v1.13 and below
At the time of writing we use a modified version of PX4 v1.13.3, which still uses the (now deprecated) mixing system to allocate control to the different actuators. Unfortunately this makes it quite cumbersome to control. I added a hack to the system by using the [[Mavlink]] messagePWM
# PX4 v1.14 and above
The new tight [[ROS2]] integration will allow controlling peripheral actuators as explained in [this video](https://youtu.be/3zRCIsq_MCE?t=1020).

View File

@@ -0,0 +1,23 @@
The nuttshell is the main shell used in the px4-environment. It can be used to [interactively work with a px4 flight controller](https://docs.px4.io/main/en/debug/consoles.html) to develop and debug. It can be accessed through [QGroundControl](https://docs.px4.io/main/en/debug/mavlink_shell.html#qgroundcontrol-mavlink-console) or [through USB](https://docs.px4.io/main/en/debug/mavlink_shell.html#mavlink-shell-py).
# Commands
An overview of all commands can be found in the [px4-documentation](https://docs.px4.io/main/en/modules/modules_main.html). Interactively you can also send the command `help` to list all available commands.
## listener
`listener uorb_topic_name`
The listener command allows debugging the uORB message system. It was important for debugging the drivers of the magnetic sensors and for verifying a working bridge.
## uORB top
`uorb top`
## micrortps_client
`micrortps_client`
# Custom Commands
## osd_payload_control
## indi control
- [ ] Make sure this title is correct #todo/b
## asr012
## a5047p

View File

@@ -0,0 +1 @@
In order to successfully build PX4 for different boards you need to configure them first by running the `boardconfig` option with the make command.

View File

@@ -0,0 +1,24 @@
The first quick and dirty implementation was done on the companion computer using a ROS2 Node (see [[Direct Actuator Control from ROS2]]). But this will use too much bandwidth on the datalink between the flight controller and the companion computer.
# PX4 Implementation
We want to be able to open and close the payload bay even if the drone is not armed. Therefore, we need to make use of the pre-arm state that exists in order to check the functionality of non-dangerous [[Actuators |actuators]] (e.g. control surfaces, not motors/propellers). The arm, disarm and pre-arm configuration is explained in the [PX4 user guide](https://docs.px4.io/main/en/advanced_config/prearm_arm_disarm.html).
We want the following parameters:
| Parameter | Description | Value |
| ---- | ---- | ---- |
| COM_PREARM_MODE | ALWAYS | 2 |
| CBRK_IO_SAFETY | No Safety Switch | 22027 |
This puts the drone into the pre-arm state upon boot and allows to move the control surfaces immediately, without engaging the propellers.
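The parameters from the table above can be set from the [[Nuttshell - NSH|nuttshell]] like this (a sketch):

```bash
param set COM_PREARM_MODE 2
param set CBRK_IO_SAFETY 22027
param save
```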
## PWM Calibration
The payload servos should be calibrated like any other servo motor. We need to configure the following parameters (assuming the servo is MAIN_5):
| Parameter | Description | Value |
| ---- | ---- | ---- |
| PWM\_MAIN\_DIS5 | PWM value in disarm state | e.g. 1500 |
| PWM\_MAIN\_FAIL5 | PWM value in failsafe state | e.g. 1500 |
| PWM\_MAIN\_MAX5 | Maximum PWM value commanded | e.g. 1958 |
| PWM\_MAIN\_MIN5 | Minimum PWM value commanded | e.g. 867 |
The values are usually measured experimentally using a simple PWM controller (a.k.a. a servo tester).
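Applied from the nuttshell, the calibration above would look like this (a sketch using the example values from the table):

```bash
param set PWM_MAIN_DIS5 1500
param set PWM_MAIN_FAIL5 1500
param set PWM_MAIN_MAX5 1958
param set PWM_MAIN_MIN5 867
param save
```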
## Payload Controller Module

View File

@@ -0,0 +1,24 @@
# FastDDS Bridge
This is the newest type of bridge and it allows integrating ROS2 and PX4 much more closely.
The [newest development](https://www.youtube.com/watch?v=3zRCIsq_MCE&ab_channel=PX4Autopilot-OpenSourceFlightControl.) allows easily defining new flight modes that run on the companion computer instead of on PX4, while PX4 remains aware of them.
# MicroRTPS Bridge
## Troubleshooting
### Baudrate Mismatch
If the baudrate is not the same on both ends, no error is thrown, but the communication does not work.
In the screenshot below you can see the status when the baudrate had a mismatch (red rectangle): there were no received messages. Once the baudrate was correct, the messages could be received (green rectangle).
![[Pasted image 20240311104419.png]]
The baudrates are changed as follows:
- on the PX4 side you can change the parameter `SER_TEL_2_BAUD` if TELEM2 is used for the micrortps bridge. Or, if you start the bridge with a nuttshell command, you can use the argument `-b`, e.g.: `micrortps_client start -t UART -d /dev/ttyS2 -b 921600`
- on the companion computer side with the argument `-b`, when launching the micrortps agent: `./install/px4_ros_com/bin/micrortps_agent -b 921600 -d /dev/ttyAMA1 -g`
### ROS2 Topic Echo does not receive PX4-Data
If the bridge is set up correctly, echoing the attitude message on the companion computer should show the messages. For this to work, the correct ROS2 environment has to be sourced and both ends of the bridge (agent on the companion computer, client on the px4 system) have to be running.
Run the following command: `ros2 topic echo /fmu/vehicle_attitude/out`.
Messages should start to appear.
If not, check the following things:
- Is the cable connected?
- Have the RX and TX cables been swapped?
- Does the baudrate match on both ends? (see [[ROS2 Interface for PX4#Baudrate Mismatch]])
-
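Two more quick checks on the companion computer (a diagnostic sketch; device and topic names as used above):

```bash
# verify the serial port is configured with the expected baudrate
stty -F /dev/ttyAMA1 speed
# verify the topic actually carries data, not just that it exists
ros2 topic hz /fmu/vehicle_attitude/out
```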

View File

@@ -0,0 +1,10 @@
The virtual swashplate algorithm is implemented in 2 modules within PX4.
# Parameters
It's important to have oneshot (or possibly dshot) as the protocol, because normal pwm signals are too slow to do the actuation at our rpm.
In the flight controller the PWM pins are grouped into timer groups, which use the same timer in the background and thus need to share the same protocol. For the [[Control Zero - Setup|Control Zero]] those groups are 1234, 56 and 78. We have chosen 7 & 8 as our virtual swashplate outputs.
| Parameter | Description | Value |
| ---- | ---- | ---- |
| PWM_MAIN_OUT | The pwm channels used as ESC outputs | 78 |
| PWM_MAIN_RATE | The pwm rate at which the signal is being generated. 0 Hz means that the [[PWM#Oneshot\|OneShot125]] protocol is used. | 0 Hz |

View File

@@ -0,0 +1,23 @@
In order to run the tablefinder demo, the monocopter needs to be equipped with 2 cameras connected to a cm4.
On the cm4 we need the feature/table_finder branch checked out in the OSD-Autopilot repository.
## Run Instructions in the Field
1. Prepare drone
1. Plug in Battery
2. Connect field computer to the VPN (ZeroTier)
3. Make sure you can ssh into the drone's cm4
4. Make sure the LTE-connection is running (true if we can ssh into the drone)
5. Make sure that the GNSS node is running properly
6. Make sure that the mavlink-router is running properly
7. Make sure QGC has the connection to the PX4 Flight controller (should be the case if the mavlink-router is running and if the field-laptop is connected to the VPN)
2. Use VSCode via SSH to get multiple terminals (or use multiple terminal windows)
1. Terminal 1:
`ros2 launch osd_autopilot tf2.launch.py`
2. Terminal 2:
`ros2 launch px4_interface logged_px4_interface.launch.py`
3. If Visualization is desired on your field laptop run the following:
1. `ros2 launch px4_visualizer px4_rviz.launch.py` which should show a nice visualization of some of the data.
## Behaviortree version
The newer versions of the tablefinder demo will implement a [behavior tree logic](Behaviour%20Trees.md).