
Commit b81f483

feat(lab): Upgrade master docs

1 parent 8687a4c commit b81f483

File tree

345 files changed: +3055 −726 lines changed

Lines changed: 50 additions & 0 deletions
@@ -0,0 +1,50 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Model Zoo: Loading Pre-trained Navigation Models\n\nThis tutorial demonstrates how to load and use the pre-trained\nState-of-the-Art (SOTA) navigation models provided by\n**RoboOrchardLab**.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "# sphinx_gallery_thumbnail_path = '_static/images/sphx_glr_install_thumb.png'"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Aux-Think: Exploring Reasoning Strategies for Data-Efficient Vision-Language Navigation\n\n[Click here to visit the homepage.](https://horizonrobotics.github.io/robot_lab/aux-think/index.html)\n\n### Loading Pretrained Model\n\n```python\nimport torch\nfrom robo_orchard_lab.models import TorchModelMixin\n\nmodel: torch.nn.Module = TorchModelMixin.load(\"hf://HorizonRobotics/Aux-Think\")\n```\n\n### Inference Pipeline\n\n```python\nimport torch\nfrom robo_orchard_lab.inference import InferencePipelineMixin\nfrom robo_orchard_lab.processors.auxthink_processor import AuxThinkInput\n\n# -----------------------------\n# Step 1. Load a saved pipeline\n# -----------------------------\npipeline = InferencePipelineMixin.load(\"hf://HorizonRobotics/Aux-Think\")\npipeline.model.eval()\n\n# -----------------------------\n# Step 2. Prepare raw input\n# -----------------------------\ndata = AuxThinkInput(\n    image_paths=[\n        \"hf://HorizonRobotics/Aux-Think/data_example/rgb_0.png\",\n        \"hf://HorizonRobotics/Aux-Think/data_example/rgb_1.png\",\n        \"hf://HorizonRobotics/Aux-Think/data_example/rgb_2.png\",\n        \"hf://HorizonRobotics/Aux-Think/data_example/rgb_3.png\",\n        \"hf://HorizonRobotics/Aux-Think/data_example/rgb_4.png\",\n        \"hf://HorizonRobotics/Aux-Think/data_example/rgb_5.png\",\n        \"hf://HorizonRobotics/Aux-Think/data_example/rgb_6.png\",\n        \"hf://HorizonRobotics/Aux-Think/data_example/rgb_7.png\",\n    ],\n    instruction=\"Go around the kitchen island and wait between the tall cabinet and wine fridge.\"\n)\n\n# -----------------------------\n# Step 3. Run inference\n# (pre_process \u2192 collate \u2192 model \u2192 post_process)\n# -----------------------------\nresult = pipeline(data)\nprint(result.text)\n\n# Example Output:\n# \"The next action is move forward 25 cm, turn left 45 degrees, turn left 15 degrees.\"\n\n# -----------------------------\n# Step 4. Batch inference (optional)\n# -----------------------------\nbatch_data = [data, data]\nbatch_results = list(pipeline(batch_data))\nfor r in batch_results:\n    print(r.text)\n```\n"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.12"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 0
}
Lines changed: 94 additions & 0 deletions
@@ -0,0 +1,94 @@
# Project RoboOrchard
#
# Copyright (c) 2024-2025 Horizon Robotics. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied. See the License for the specific language governing
# permissions and limitations under the License.

# ruff: noqa: E501 D415 D205 E402

"""Model Zoo: Loading Pre-trained Navigation Models
=================================================================

This tutorial demonstrates how to load and use the pre-trained
State-of-the-Art (SOTA) navigation models provided by
**RoboOrchardLab**.
"""

# sphinx_gallery_thumbnail_path = '_static/images/sphx_glr_install_thumb.png'

# %%
# Aux-Think: Exploring Reasoning Strategies for Data-Efficient Vision-Language Navigation
# --------------------------------------------------------------------------------------------
#
# `Click here to visit the homepage. <https://horizonrobotics.github.io/robot_lab/aux-think/index.html>`__
#
# Loading Pretrained Model
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# .. code-block:: python
#
#     import torch
#     from robo_orchard_lab.models import TorchModelMixin
#
#     model: torch.nn.Module = TorchModelMixin.load("hf://HorizonRobotics/Aux-Think")
#
# Inference Pipeline
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# .. code-block:: python
#
#     import torch
#     from robo_orchard_lab.inference import InferencePipelineMixin
#     from robo_orchard_lab.processors.auxthink_processor import AuxThinkInput
#
#     # -----------------------------
#     # Step 1. Load a saved pipeline
#     # -----------------------------
#     pipeline = InferencePipelineMixin.load("hf://HorizonRobotics/Aux-Think")
#     pipeline.model.eval()
#
#     # -----------------------------
#     # Step 2. Prepare raw input
#     # -----------------------------
#     data = AuxThinkInput(
#         image_paths=[
#             "hf://HorizonRobotics/Aux-Think/data_example/rgb_0.png",
#             "hf://HorizonRobotics/Aux-Think/data_example/rgb_1.png",
#             "hf://HorizonRobotics/Aux-Think/data_example/rgb_2.png",
#             "hf://HorizonRobotics/Aux-Think/data_example/rgb_3.png",
#             "hf://HorizonRobotics/Aux-Think/data_example/rgb_4.png",
#             "hf://HorizonRobotics/Aux-Think/data_example/rgb_5.png",
#             "hf://HorizonRobotics/Aux-Think/data_example/rgb_6.png",
#             "hf://HorizonRobotics/Aux-Think/data_example/rgb_7.png",
#         ],
#         instruction="Go around the kitchen island and wait between the tall cabinet and wine fridge."
#     )
#
#     # -----------------------------
#     # Step 3. Run inference
#     # (pre_process → collate → model → post_process)
#     # -----------------------------
#     result = pipeline(data)
#     print(result.text)
#
#     # Example Output:
#     # "The next action is move forward 25 cm, turn left 45 degrees, turn left 15 degrees."
#
#     # -----------------------------
#     # Step 4. Batch inference (optional)
#     # -----------------------------
#     batch_data = [data, data]
#     batch_results = list(pipeline(batch_data))
#     for r in batch_results:
#         print(r.text)
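The tutorial's Step 3 describes a pipeline call that chains pre_process → collate → model → post_process, and Step 4 shows the same callable accepting either a single input or a batch. As a rough illustration of that calling convention — not RoboOrchardLab's actual implementation; the names `SketchPipeline`, `NavInput`, and `NavResult` below are hypothetical stand-ins — a minimal sketch:

```python
from dataclasses import dataclass


@dataclass
class NavInput:
    """Hypothetical stand-in for an input record like AuxThinkInput."""

    image_paths: list[str]
    instruction: str


@dataclass
class NavResult:
    """Hypothetical stand-in for a pipeline output with a .text field."""

    text: str


class SketchPipeline:
    """Minimal sketch of a pre_process -> collate -> model -> post_process flow."""

    def pre_process(self, item: NavInput) -> dict:
        # A real pipeline would load and normalize the images here;
        # this sketch just records the frame count and instruction.
        return {"n_frames": len(item.image_paths), "instruction": item.instruction}

    def collate(self, items: list[dict]) -> dict:
        # Group pre-processed samples into one batch structure.
        return {"batch": items}

    def model(self, batch: dict) -> list[str]:
        # Placeholder "model": emit one fixed-format string per sample.
        return [f"predicted action for {s['n_frames']} frames" for s in batch["batch"]]

    def post_process(self, outputs: list[str]) -> list[NavResult]:
        # Wrap raw model outputs into result objects.
        return [NavResult(text=t) for t in outputs]

    def __call__(self, data):
        # Accept a single sample (Step 3) or a list of samples (Step 4).
        items = data if isinstance(data, list) else [data]
        batch = self.collate([self.pre_process(x) for x in items])
        results = self.post_process(self.model(batch))
        return results if isinstance(data, list) else results[0]
```

The single/batch dispatch in `__call__` mirrors how the tutorial uses one callable for both `pipeline(data)` and `pipeline(batch_data)`; the actual library may structure this differently.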

robo_orchard/lab/master/_downloads/adb6768b0bf122a0b7d69c666a3a53eb/nonb-05_locomotion_models.ipynb

Lines changed: 0 additions & 50 deletions
This file was deleted.

robo_orchard/lab/master/_downloads/d65b47d7a63f2b3b4fc1f68581179d77/nonb-05_locomotion_models.py

Lines changed: 0 additions & 48 deletions
This file was deleted.

robo_orchard/lab/master/_images/sphx_glr_nonb-05_locomotion_models_thumb.png renamed to robo_orchard/lab/master/_images/sphx_glr_nonb-05_navigation_models_thumb.png

File renamed without changes.

robo_orchard/lab/master/_modules/index.html

Lines changed: 3 additions & 1 deletion
@@ -374,7 +374,7 @@
 <li class="toctree-l2"><a class="reference internal" href="../build/tutorials/model_zoo_tutorial/nonb-02_inference_api.html">Creating, Saving, and Loading Inference Pipelines</a></li>
 <li class="toctree-l2"><a class="reference internal" href="../build/tutorials/model_zoo_tutorial/nonb-03_perception_models.html">Model Zoo: Loading Pre-trained Perception Models</a></li>
 <li class="toctree-l2"><a class="reference internal" href="../build/tutorials/model_zoo_tutorial/nonb-04_manipulation_models.html">Model Zoo: Loading Pre-trained Manipulation Models</a></li>
-<li class="toctree-l2"><a class="reference internal" href="../build/tutorials/model_zoo_tutorial/nonb-05_locomotion_models.html">Model Zoo: Loading Pre-trained Locomotion Models</a></li>
+<li class="toctree-l2"><a class="reference internal" href="../build/tutorials/model_zoo_tutorial/nonb-05_navigation_models.html">Model Zoo: Loading Pre-trained Navigation Models</a></li>
 </ul>
 </li>
 </ul>
@@ -487,6 +487,7 @@
 <li class="toctree-l3 has-children"><a class="reference internal" href="../autoapi/robo_orchard_lab/models/index.html">models</a><input class="toctree-checkbox" id="toctree-checkbox-26" name="toctree-checkbox-26" role="switch" type="checkbox"/><label for="toctree-checkbox-26"><div class="visually-hidden">Toggle navigation of models</div><i class="icon"><svg><use href="#svg-arrow-right"></use></svg></i></label><ul>
 <li class="toctree-l4 has-children"><a class="reference internal" href="../autoapi/robo_orchard_lab/models/aux_think/index.html">aux_think</a><input class="toctree-checkbox" id="toctree-checkbox-27" name="toctree-checkbox-27" role="switch" type="checkbox"/><label for="toctree-checkbox-27"><div class="visually-hidden">Toggle navigation of aux_think</div><i class="icon"><svg><use href="#svg-arrow-right"></use></svg></i></label><ul>
 <li class="toctree-l5"><a class="reference internal" href="../autoapi/robo_orchard_lab/models/aux_think/model/index.html">model</a></li>
+<li class="toctree-l5"><a class="reference internal" href="../autoapi/robo_orchard_lab/models/aux_think/processor/index.html">processor</a></li>
 </ul>
 </li>
 <li class="toctree-l4 has-children"><a class="reference internal" href="../autoapi/robo_orchard_lab/models/bip3d/index.html">bip3d</a><input class="toctree-checkbox" id="toctree-checkbox-28" name="toctree-checkbox-28" role="switch" type="checkbox"/><label for="toctree-checkbox-28"><div class="visually-hidden">Toggle navigation of bip3d</div><i class="icon"><svg><use href="#svg-arrow-right"></use></svg></i></label><ul>
@@ -715,6 +716,7 @@ <h1>All modules for which code is available</h1>
 <li><a href="robo_orchard_lab/inference/processor/mixin.html">robo_orchard_lab.inference.processor.mixin</a></li>
 <li><a href="robo_orchard_lab/metrics/base.html">robo_orchard_lab.metrics.base</a></li>
 <li><a href="robo_orchard_lab/models/aux_think/model.html">robo_orchard_lab.models.aux_think.model</a></li>
+<li><a href="robo_orchard_lab/models/aux_think/processor.html">robo_orchard_lab.models.aux_think.processor</a></li>
 <li><a href="robo_orchard_lab/models/bip3d/bert.html">robo_orchard_lab.models.bip3d.bert</a></li>
 <li><a href="robo_orchard_lab/models/bip3d/feature_enhancer.html">robo_orchard_lab.models.bip3d.feature_enhancer</a></li>
 <li><a href="robo_orchard_lab/models/bip3d/grounding_decoder/bbox3d_decoder.html">robo_orchard_lab.models.bip3d.grounding_decoder.bbox3d_decoder</a></li>

robo_orchard/lab/master/_modules/robo_orchard_lab/dataset/collates.html

Lines changed: 2 additions & 1 deletion
@@ -374,7 +374,7 @@
 <li class="toctree-l2"><a class="reference internal" href="../../../build/tutorials/model_zoo_tutorial/nonb-02_inference_api.html">Creating, Saving, and Loading Inference Pipelines</a></li>
 <li class="toctree-l2"><a class="reference internal" href="../../../build/tutorials/model_zoo_tutorial/nonb-03_perception_models.html">Model Zoo: Loading Pre-trained Perception Models</a></li>
 <li class="toctree-l2"><a class="reference internal" href="../../../build/tutorials/model_zoo_tutorial/nonb-04_manipulation_models.html">Model Zoo: Loading Pre-trained Manipulation Models</a></li>
-<li class="toctree-l2"><a class="reference internal" href="../../../build/tutorials/model_zoo_tutorial/nonb-05_locomotion_models.html">Model Zoo: Loading Pre-trained Locomotion Models</a></li>
+<li class="toctree-l2"><a class="reference internal" href="../../../build/tutorials/model_zoo_tutorial/nonb-05_navigation_models.html">Model Zoo: Loading Pre-trained Navigation Models</a></li>
 </ul>
 </li>
 </ul>
@@ -487,6 +487,7 @@
 <li class="toctree-l3 has-children"><a class="reference internal" href="../../../autoapi/robo_orchard_lab/models/index.html">models</a><input class="toctree-checkbox" id="toctree-checkbox-26" name="toctree-checkbox-26" role="switch" type="checkbox"/><label for="toctree-checkbox-26"><div class="visually-hidden">Toggle navigation of models</div><i class="icon"><svg><use href="#svg-arrow-right"></use></svg></i></label><ul>
 <li class="toctree-l4 has-children"><a class="reference internal" href="../../../autoapi/robo_orchard_lab/models/aux_think/index.html">aux_think</a><input class="toctree-checkbox" id="toctree-checkbox-27" name="toctree-checkbox-27" role="switch" type="checkbox"/><label for="toctree-checkbox-27"><div class="visually-hidden">Toggle navigation of aux_think</div><i class="icon"><svg><use href="#svg-arrow-right"></use></svg></i></label><ul>
 <li class="toctree-l5"><a class="reference internal" href="../../../autoapi/robo_orchard_lab/models/aux_think/model/index.html">model</a></li>
+<li class="toctree-l5"><a class="reference internal" href="../../../autoapi/robo_orchard_lab/models/aux_think/processor/index.html">processor</a></li>
 </ul>
 </li>
 <li class="toctree-l4 has-children"><a class="reference internal" href="../../../autoapi/robo_orchard_lab/models/bip3d/index.html">bip3d</a><input class="toctree-checkbox" id="toctree-checkbox-28" name="toctree-checkbox-28" role="switch" type="checkbox"/><label for="toctree-checkbox-28"><div class="visually-hidden">Toggle navigation of bip3d</div><i class="icon"><svg><use href="#svg-arrow-right"></use></svg></i></label><ul>

robo_orchard/lab/master/_modules/robo_orchard_lab/dataset/datatypes/camera.html

Lines changed: 2 additions & 1 deletion
@@ -374,7 +374,7 @@
 <li class="toctree-l2"><a class="reference internal" href="../../../../build/tutorials/model_zoo_tutorial/nonb-02_inference_api.html">Creating, Saving, and Loading Inference Pipelines</a></li>
 <li class="toctree-l2"><a class="reference internal" href="../../../../build/tutorials/model_zoo_tutorial/nonb-03_perception_models.html">Model Zoo: Loading Pre-trained Perception Models</a></li>
 <li class="toctree-l2"><a class="reference internal" href="../../../../build/tutorials/model_zoo_tutorial/nonb-04_manipulation_models.html">Model Zoo: Loading Pre-trained Manipulation Models</a></li>
-<li class="toctree-l2"><a class="reference internal" href="../../../../build/tutorials/model_zoo_tutorial/nonb-05_locomotion_models.html">Model Zoo: Loading Pre-trained Locomotion Models</a></li>
+<li class="toctree-l2"><a class="reference internal" href="../../../../build/tutorials/model_zoo_tutorial/nonb-05_navigation_models.html">Model Zoo: Loading Pre-trained Navigation Models</a></li>
 </ul>
 </li>
 </ul>
@@ -487,6 +487,7 @@
 <li class="toctree-l3 has-children"><a class="reference internal" href="../../../../autoapi/robo_orchard_lab/models/index.html">models</a><input class="toctree-checkbox" id="toctree-checkbox-26" name="toctree-checkbox-26" role="switch" type="checkbox"/><label for="toctree-checkbox-26"><div class="visually-hidden">Toggle navigation of models</div><i class="icon"><svg><use href="#svg-arrow-right"></use></svg></i></label><ul>
 <li class="toctree-l4 has-children"><a class="reference internal" href="../../../../autoapi/robo_orchard_lab/models/aux_think/index.html">aux_think</a><input class="toctree-checkbox" id="toctree-checkbox-27" name="toctree-checkbox-27" role="switch" type="checkbox"/><label for="toctree-checkbox-27"><div class="visually-hidden">Toggle navigation of aux_think</div><i class="icon"><svg><use href="#svg-arrow-right"></use></svg></i></label><ul>
 <li class="toctree-l5"><a class="reference internal" href="../../../../autoapi/robo_orchard_lab/models/aux_think/model/index.html">model</a></li>
+<li class="toctree-l5"><a class="reference internal" href="../../../../autoapi/robo_orchard_lab/models/aux_think/processor/index.html">processor</a></li>
 </ul>
 </li>
 <li class="toctree-l4 has-children"><a class="reference internal" href="../../../../autoapi/robo_orchard_lab/models/bip3d/index.html">bip3d</a><input class="toctree-checkbox" id="toctree-checkbox-28" name="toctree-checkbox-28" role="switch" type="checkbox"/><label for="toctree-checkbox-28"><div class="visually-hidden">Toggle navigation of bip3d</div><i class="icon"><svg><use href="#svg-arrow-right"></use></svg></i></label><ul>
