Merged
2 changes: 1 addition & 1 deletion docs/core/examples/guides/camera-calibration.md
@@ -37,7 +37,7 @@ Instead of using `CameraStreamer`, advanced users can follow this process with a

:::

-Once you have a established an image stream, make sure to generate a **checkerboard** pattern (e.g., from
+Once you have established the image stream, make sure to generate a **checkerboard** pattern (e.g., from
[here](https://calib.io/pages/camera-calibration-pattern-generator)). The calibrator will use this pattern to determine
how the picture is distorted and ultimately generate the necessary matrices that can be used to undistort images from
your camera. Take note of the checkerboard width, height, and box size as you will need it later. Print the checkerboard
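As background for this change: the calibration step the guide describes (detecting the printed checkerboard and computing undistortion matrices) is typically done with OpenCV. A minimal sketch of the first step, building the board's reference points from the width, height, and box size you noted down; the 9x6 inner-corner count and 25 mm square size are hypothetical values, not ones from the guide:

```python
import numpy as np

def board_object_points(cols: int, rows: int, square_size: float) -> np.ndarray:
    """3D coordinates of the checkerboard's inner corners on the z=0 plane.

    `cols` and `rows` are the *inner corner* counts of the generated pattern,
    and `square_size` is the printed box size (e.g., in meters).
    """
    points = np.zeros((rows * cols, 3), np.float32)
    points[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square_size
    return points

# For each captured frame, OpenCV detects the 2D corners with
# cv2.findChessboardCorners(gray, (cols, rows)); once enough views are
# collected, cv2.calibrateCamera(...) returns the camera matrix and
# distortion coefficients used to undistort images from the stream.
objp = board_object_points(9, 6, 0.025)  # hypothetical 9x6 board, 25 mm squares
print(objp.shape)  # (54, 3)
```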
4 changes: 2 additions & 2 deletions docs/core/examples/guides/camera-streamer.md
@@ -46,7 +46,7 @@ Before pressing Start, let us go through the parameters first. You should see:
- **Rate**: This is the component's rate, but it has no effect on the operation of `CameraStreamer`.
- **Source**: Path to the source device or video file. If using a camera, this is typically of the form `/dev/videoX`,
whereas for video files you need to provide the absolute path to the video, e.g., `/path/to/video`.
-- **Camera frame**: The reference frame that will be used when publishing image messages, which should correspond the
+- **Camera frame**: The reference frame that will be used when publishing image messages, which should correspond to the
camera's sensor position.
- **Camera configuration**: A YAML-formatted camera configuration file containing the camera intrinsics (optional). If
you don't have a calibration file for your camera, you can follow our [calibration guide](./camera-calibration.md).
@@ -79,7 +79,7 @@ In newer versions of `CameraStreamer` you will also have access to:
Once you have selected an appropriate **source**:

1. Press **Start** to start the application.
-2. To see the live camera feed, click on the gear icon on the bottom right and select **Launch RViz**.
+2. To see the live camera feed, select **Launch RViz** from the Launcher settings.
3. In RViz, select _Add > By topic > /camera_streamer/image > Image_. This adds a panel that shows the live image. The
undistorted image (if available) can also be found under _/camera_streamer/undistorted_image > Image_.

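For reference, the optional **Camera configuration** parameter mentioned above expects the camera intrinsics in YAML. A hedged sketch of what such a file commonly looks like, using the widespread ROS `camera_info` layout; the exact keys `CameraStreamer` expects may differ, and every value below is a placeholder:

```yaml
image_width: 640
image_height: 480
camera_name: my_camera   # placeholder name
camera_matrix:
  rows: 3
  cols: 3
  data: [fx, 0.0, cx, 0.0, fy, cy, 0.0, 0.0, 1.0]  # placeholder intrinsics
distortion_model: plumb_bob
distortion_coefficients:
  rows: 1
  cols: 5
  data: [k1, k2, p1, p2, k3]  # placeholder distortion terms
```

Running the calibration guide produces the real numbers to fill in here.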
4 changes: 2 additions & 2 deletions docs/core/examples/guides/jtc-guide.md
@@ -98,12 +98,12 @@ There are 2 ways of setting a trajectory in JTC:
In both cases, receiving a new joint trajectory will first trigger cancellation of an active trajectory, if there is
one. That is, **there is no trajectory buffering or appending taking place**. As with many things in the AICA Universe,
behaviors are event-driven. If you wish to send multiple trajectories back-to-back, you will have to rely on the
-execution status of the active trajectory handled by JTC. There is a practical example of how do this in following
+execution status of the active trajectory handled by JTC. There is a practical example of how to do this in the following
sections (see [Putting an application together](#putting-an-application-together)).

### Trajectory execution status

-The controller exposes 4 predicates to reflect the the execution status of a trajectory, namely:
+The controller exposes 4 predicates to reflect the execution status of a trajectory, namely:

- `Has active trajectory`: A trajectory has been set and is being executed
- `Has trajectory succeeded`: A trajectory was executed successfully (i.e., reached the final waypoint within all
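Since there is no buffering or appending, chaining trajectories back-to-back means reacting to the success predicate before sending the next trajectory. A minimal event-loop sketch of that pattern; the `Controller` class and its method names below are hypothetical stand-ins for illustration only, not the real JTC interface:

```python
class Controller:
    """Hypothetical stand-in mimicking JTC's predicate-based status."""

    def __init__(self):
        self.active = None
        self.executed = []

    def set_trajectory(self, trajectory):
        # Receiving a new trajectory replaces any active one (no appending).
        self.active = trajectory

    def spin_once(self):
        # Pretend the active trajectory completes within one cycle.
        if self.active is not None:
            self.executed.append(self.active)
            self.active = None

    def has_trajectory_succeeded(self):
        # Analogous to the `Has trajectory succeeded` predicate.
        return self.active is None and bool(self.executed)

def run_back_to_back(controller, trajectories):
    """Send each trajectory only once the previous one has succeeded."""
    pending = list(trajectories)
    controller.set_trajectory(pending.pop(0))
    while pending:
        controller.spin_once()
        if controller.has_trajectory_succeeded():
            controller.set_trajectory(pending.pop(0))
    controller.spin_once()
    return controller.executed

print(run_back_to_back(Controller(), ["traj_a", "traj_b", "traj_c"]))
# ['traj_a', 'traj_b', 'traj_c']
```

In a real application the same sequencing is expressed with events triggered by the controller's predicates rather than a polling loop.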
4 changes: 2 additions & 2 deletions docs/core/examples/guides/ur-sim-guide.md
@@ -45,7 +45,7 @@ Executing the following commands runs URSim in a Docker container:

```bash
git clone https://github.com/aica-technology/simulators.git
-cd simulators
+cd simulators/ursim
./run.sh
```

@@ -94,7 +94,7 @@ Follow the terminal link in a browser to access the simulated robot.

1. In the window that appears, select **Connect**.
2. After the teaching pendant interface loads up, navigate to the settings page by clicking the burger icon in the top
-   left corner of the screen.
+   right corner of the screen.
3. Click on the **System** tab, then select the **Remote Control** tab.
4. Click **Enable** and then **Exit** at the bottom left of the screen.
5. Turn on the robot by pressing the red button located in the bottom left corner of the screen. Click **ON** followed