To run the tablefinder demo, the monocopter must be equipped with two cameras connected to a CM4.

On the CM4, check out the `feature/table_finder` branch of the OSD-Autopilot repository.
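Checking out the branch can be sketched as follows (the repository path on the CM4 is an assumption; adjust it to your setup):

```shell
# Path to the OSD-Autopilot checkout on the CM4 (hypothetical; adjust as needed).
cd ~/OSD-Autopilot

# Fetch the latest refs and switch to the demo branch.
git fetch origin
git checkout feature/table_finder
git pull --ff-only
```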
## Run Instructions in the Field
1. Prepare the drone
    1. Plug in the battery
    2. Connect the field computer to the VPN (ZeroTier)
    3. Make sure you can SSH into the drone's CM4
    4. Make sure the LTE connection is up (true if you can SSH into the drone)
    5. Make sure the GNSS node is running properly
    6. Make sure the mavlink-router is running properly
    7. Make sure QGC is connected to the PX4 flight controller (should be the case if the mavlink-router is running and the field laptop is connected to the VPN)
2. Use VS Code via SSH to get multiple terminals (or use multiple terminal windows)
    1. Terminal 1:

        `ros2 launch osd_autopilot tf2.launch.py`

    2. Terminal 2:

        `ros2 launch px4_interface logged_px4_interface.launch.py`

3. If visualization is desired on your field laptop, run the following:
    1. `ros2 launch px4_visualizer px4_rviz.launch.py`, which shows a visualization of some of the data.
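Once both launch files are up, a quick sanity check from another terminal on the CM4 can look like this (the topic name below is a guess; use whatever `ros2 topic list` actually reports):

```shell
# List running nodes; the tf2 and px4_interface nodes should appear.
ros2 node list

# List topics; PX4-related topics should be present.
ros2 topic list

# Check that data is actually flowing on a topic of interest
# (topic name is hypothetical; pick one from `ros2 topic list`).
ros2 topic hz /fmu/out/vehicle_odometry
```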
## Behavior tree version
Newer versions of the tablefinder demo will implement [behavior tree logic](Behaviour%20Trees.md).