arxiv:2603.05438

Planning in 8 Tokens: A Compact Discrete Tokenizer for Latent World Model

Published on Mar 5
· Submitted by Adina Yakefu on Mar 9
#2 Paper of the day

Abstract

CompACT, a discrete tokenizer that reduces observation encoding from hundreds to 8 tokens, enables faster and more efficient world model planning for real-time control applications.

AI-generated summary

World models provide a powerful framework for simulating environment dynamics conditioned on actions or instructions, enabling downstream tasks such as action planning or policy learning. Recent approaches leverage world models as learned simulators, but their application to decision-time planning remains computationally prohibitive for real-time control. A key bottleneck lies in latent representations: conventional tokenizers encode each observation into hundreds of tokens, making planning both slow and resource-intensive. To address this, we propose CompACT, a discrete tokenizer that compresses each observation into as few as 8 tokens, drastically reducing computational cost while preserving essential information for planning. An action-conditioned world model equipped with the CompACT tokenizer achieves competitive planning performance with orders-of-magnitude faster planning, offering a practical step toward real-world deployment of world models.
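
To put the computational argument in concrete terms, here is a rough back-of-the-envelope sketch in Python. The per-frame count for a conventional tokenizer (256) and the planning horizon (16 steps) are illustrative assumptions; the abstract only says "hundreds" of tokens versus "as few as 8".

```python
# Rough sequence-length comparison for decision-time planning.
# Assumed numbers: 256 tokens/frame for a conventional tokenizer ("hundreds"
# in the abstract), 8 tokens/frame for CompACT, and a 16-step rollout.
conventional_tokens_per_frame = 256   # assumption, not stated in the paper
compact_tokens_per_frame = 8
horizon = 16                          # assumed planning horizon

conv_seq = conventional_tokens_per_frame * horizon   # 4096 tokens per rollout
compact_seq = compact_tokens_per_frame * horizon     # 128 tokens per rollout

# Self-attention cost grows roughly quadratically with sequence length, so the
# cost of a single rollout drops by about (conv_seq / compact_seq) ** 2.
print(conv_seq, compact_seq, (conv_seq / compact_seq) ** 2)   # 4096 128 1024.0
```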

Community

Paper author • edited 1 day ago

8-16 tokens per frame for planning, with a frozen DINOv3 backbone and a learnable latent resampler, is wild. I'd like to see how robust that compact latent space is when important but rare cues get compressed away, especially for longer-horizon tasks. The breakdown on arxivlens was solid and helped me sanity-check the token flow while skimming; nice to have a quick walkthrough: https://arxivlens.com/PaperView/Details/planning-in-8-tokens-a-compact-discrete-tokenizer-for-latent-world-model-6795-84bb8360
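
For anyone trying to picture the token flow the comment above describes, below is a minimal sketch of a learnable latent resampler with a discrete bottleneck, assuming PyTorch, pre-extracted frozen DINOv3 patch features, and a simple nearest-neighbour codebook with a straight-through estimator. Module names, sizes, and the quantization scheme are illustrative assumptions, not the authors' implementation.

```python
# Sketch: a handful of learnable queries cross-attend to frozen backbone
# features and are snapped to a small codebook, yielding 8 discrete tokens
# per observation. Illustrative only; not the paper's actual code.
import torch
import torch.nn as nn

class LatentResampler(nn.Module):
    def __init__(self, feat_dim=768, num_latents=8, codebook_size=1024):
        super().__init__()
        # Learnable queries that compress "hundreds" of patch tokens to num_latents.
        self.queries = nn.Parameter(torch.randn(num_latents, feat_dim) * 0.02)
        self.cross_attn = nn.MultiheadAttention(feat_dim, num_heads=8, batch_first=True)
        # Discrete bottleneck: nearest-neighbour lookup into a learned codebook.
        self.codebook = nn.Embedding(codebook_size, feat_dim)

    def forward(self, patch_feats):                  # (B, N_patches, feat_dim), frozen
        b = patch_feats.size(0)
        q = self.queries.unsqueeze(0).expand(b, -1, -1)
        latents, _ = self.cross_attn(q, patch_feats, patch_feats)    # (B, 8, D)
        flat = latents.reshape(-1, latents.size(-1))                 # (B*8, D)
        dists = torch.cdist(flat, self.codebook.weight)              # (B*8, K)
        codes = dists.argmin(dim=-1).view(b, -1)                     # (B, 8) token ids
        quantised = self.codebook(codes)
        # Straight-through estimator so gradients reach the resampler.
        quantised = latents + (quantised - latents).detach()
        return codes, quantised

# Dummy features standing in for frozen DINOv3 patch tokens (shapes assumed).
feats = torch.randn(2, 256, 768)
codes, embs = LatentResampler()(feats)
print(codes.shape, embs.shape)   # torch.Size([2, 8]) torch.Size([2, 8, 768])
```

The world model would then roll forward over these 8 token ids per frame, conditioned on actions, which is where the planning speedup described in the abstract comes from.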

