TrajBooster: Boosting Humanoid Whole-Body Manipulation via Trajectory-Centric Learning
Paper: arXiv:2509.11839
Paper | Project Page | Code
This dataset contains action retargeting data from Agibot to the Unitree G1 humanoid robot.
Method 1: Download with the huggingface_hub Python API
pip install huggingface-hub
from huggingface_hub import snapshot_download
# Download the entire dataset
snapshot_download(
    repo_id="l2aggle/Agibot2UnitreeG1Retarget",
    repo_type="dataset",
    local_dir="./Agibot2UnitreeG1Retarget",
)
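If you only need the split archive parts rather than the whole repository, snapshot_download also accepts an allow_patterns filter. This is a minimal sketch: the "A2UG1_dataset.tar.gz.*" pattern is an assumption based on the extraction step below and may need adjusting to the actual filenames in the repo.
from huggingface_hub import snapshot_download
# Sketch: fetch only the split archive parts instead of the full repository.
# The pattern below is an assumption taken from the extraction command further
# down; check the repository file tree for the actual names.
snapshot_download(
    repo_id="l2aggle/Agibot2UnitreeG1Retarget",
    repo_type="dataset",
    local_dir="./Agibot2UnitreeG1Retarget",
    allow_patterns=["A2UG1_dataset.tar.gz.*"],
)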
Method 2: Clone with git and git-lfs
# Make sure git-lfs is installed
git lfs install
# Clone the repository (this will download LFS pointer files)
git clone https://huggingface.co/datasets/l2aggle/Agibot2UnitreeG1Retarget
cd Agibot2UnitreeG1Retarget
# Download the actual large files
git lfs pull
Method 3: Download individual parts through the Hugging Face web interface: https://huggingface.co/datasets/l2aggle/Agibot2UnitreeG1Retarget/tree/main
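A single part can also be fetched programmatically with hf_hub_download. This is a sketch only: the filename below is a hypothetical example, so replace it with an actual part name from the file tree linked above.
from huggingface_hub import hf_hub_download
# Sketch: download one split archive part by name.
# "A2UG1_dataset.tar.gz.aa" is a hypothetical filename; use the exact part
# names listed in the repository file tree.
local_path = hf_hub_download(
    repo_id="l2aggle/Agibot2UnitreeG1Retarget",
    repo_type="dataset",
    filename="A2UG1_dataset.tar.gz.aa",
)
print(local_path)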
After downloading, extract the complete dataset:
# Combine and extract all parts
cat A2UG1_dataset.tar.gz.* | tar -xzf -
This will create the complete A2UG1_dataset folder with all original files.
A2UG1_dataset/
├── [your dataset structure will be shown here after extraction]
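If cat and tar are unavailable (for example on plain Windows), the recombine-and-extract step can be reproduced in Python. This is a minimal sketch, assuming the parts sort into the correct order by filename.
import glob
import shutil
import tarfile
# Recombine the split parts into a single archive (equivalent to the cat step).
parts = sorted(glob.glob("A2UG1_dataset.tar.gz.*"))
with open("A2UG1_dataset.tar.gz", "wb") as out:
    for part in parts:
        with open(part, "rb") as src:
            shutil.copyfileobj(src, out)
# Extract the recombined archive (equivalent to tar -xzf -).
with tarfile.open("A2UG1_dataset.tar.gz", "r:gz") as tar:
    tar.extractall(path=".")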
Prerequisites: the huggingface-hub package for Method 1, and git-lfs for Method 2.
# For Method 1
pip install huggingface-hub
# For Method 2 (if git-lfs not installed)
# Ubuntu/Debian:
sudo apt install git-lfs
# macOS:
brew install git-lfs
# Windows: download from https://git-lfs.github.io/
License: Apache 2.0