Co-Layout: LLM-driven Co-optimization for Interior Layout

Chucheng Xiang1, Ruchao Bao1, Biyin Feng2, Wenzheng Wu1, Zhongyuan Liu3, Yirui Guan3, Ligang Liu1*
1University of Science and Technology of China
2Tsinghua University
3Tencent
AAAI 2026

Abstract

We present a novel framework for automated interior design that combines large language models (LLMs) with grid-based integer programming to jointly optimize room layout and furniture placement. Given a textual prompt, an LLM-driven agent workflow extracts structured design constraints on room configurations and furniture arrangements. These constraints are encoded into a unified grid-based representation inspired by Le Corbusier's “Modulor”. Our formulation accounts for key design requirements, including corridor connectivity, room accessibility, spatial exclusivity, and user-specified preferences. To improve computational efficiency, we adopt a coarse-to-fine optimization strategy: a simplified problem is first solved on a low-resolution grid, and its solution then guides optimization at the full resolution. Experimental results across diverse scenarios demonstrate that our joint optimization approach significantly outperforms existing two-stage design pipelines in solution quality, and achieves notable computational efficiency through the coarse-to-fine strategy.
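The coarse-to-fine idea can be illustrated with a toy sketch (our own illustration, not the paper's released code or its actual integer-programming formulation): a room placement problem is first checked on a downsampled occupancy grid, and the full-resolution search is then restricted to the neighborhood of the coarse candidate. All function names and the grid encoding here are our assumptions.

```python
# Toy sketch of coarse-to-fine grid placement (illustrative only; the paper
# solves a full integer program, not this greedy scan).
from itertools import product

def fits(grid, r, c, h, w):
    """Rectangle [r, r+h) x [c, c+w) lies entirely on free (0) cells."""
    rows, cols = len(grid), len(grid[0])
    if r + h > rows or c + w > cols:
        return False
    return all(grid[i][j] == 0 for i, j in product(range(r, r + h), range(c, c + w)))

def coarsen(grid, f):
    """Downsample by factor f: a coarse cell is blocked if any fine cell is."""
    rows, cols = len(grid) // f, len(grid[0]) // f
    return [[max(grid[i * f + di][j * f + dj] for di in range(f) for dj in range(f))
             for j in range(cols)] for i in range(rows)]

def place_coarse_to_fine(grid, h, w, f=2):
    """Find a top-left (r, c) for an h x w room; the coarse pass prunes the search."""
    coarse = coarsen(grid, f)
    ch, cw = -(-h // f), -(-w // f)  # ceil-divide so the coarse room is not undersized
    for cr, cc in product(range(len(coarse)), range(len(coarse[0]))):
        if fits(coarse, cr, cc, ch, cw):
            # Fine pass: only scan offsets inside this coarse candidate's block.
            for r, c in product(range(cr * f, cr * f + f), range(cc * f, cc * f + f)):
                if fits(grid, r, c, h, w):
                    return (r, c)
    return None
```

Because a coarse cell is marked blocked whenever any of its fine cells is, the coarse pass is conservative: it guides and prunes the fine search rather than guaranteeing feasibility, mirroring how the low-resolution solution guides the full-resolution solve.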

Pipeline

Pipeline overview

Fig 1. Overview of our automated framework. (a) First, the LLM-based workflow processes user requirements to generate spatial constraints for rooms and furniture, along with boundary conditions for the floor. (b) These constraints are then encoded into an integer programming model using our grid-based representation. The model is efficiently solved through a coarse-to-fine strategy to obtain the final layout. (c) The generated layout is converted into a white-box scene in Blender. Suitable assets are then retrieved from the 3D-FUTURE dataset and a curated library of over 2,000 assets via semantic embedding and size matching to populate the scene.
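The retrieval step in (c) can be sketched as follows (a minimal illustration under our own assumptions, not the released code): candidate assets are first filtered by a relative size tolerance against the target footprint, and the survivors are ranked by embedding similarity. The tolerance value, the toy 2-D embeddings, and all names below are hypothetical.

```python
# Toy sketch of asset retrieval by semantic embedding plus size matching.
import math

def cosine(a, b):
    """Cosine similarity between two non-zero vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, target_size, assets, size_tol=0.2):
    """Return the best-matching asset name, or None if no size-feasible asset.

    assets: list of (name, embedding, (w, d, h)) tuples, sizes in meters.
    An asset passes if every dimension is within size_tol (relative) of target.
    """
    def size_ok(size):
        return all(abs(s - t) / t <= size_tol for s, t in zip(size, target_size))

    candidates = [(name, cosine(query_vec, vec))
                  for name, vec, size in assets if size_ok(size)]
    return max(candidates, key=lambda x: x[1])[0] if candidates else None
```

In practice the embeddings would come from a pretrained text/image encoder over the asset library; the hard size filter before semantic ranking keeps oversized assets out of the white-box slots regardless of how well their descriptions match.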

Main Results

Main results

Fig 2. Various examples generated by our method, demonstrating its capability to handle diverse inputs including residential and non-residential spaces. Users can specify requirements such as building type, floor area, room functions, and required furniture. Top: Input text. Middle: Bird's-eye view of the scene. Bottom: Zoom-in views highlighting key details.


Result (a)


Result (b)


Result (c)


Result (d)

Comparisons

Comparisons with baselines

Fig 3. Comparison with baselines. Our approach consistently generates well-structured layouts with full accessibility and clear circulation. In contrast, baseline methods often produce designs with critical flaws, such as illogical circulation paths that violate privacy (e.g., Holodeck in a, d) and unreachable spaces or impractical room shapes (e.g., AnyHome in c, e).

Method    OOR↓  OOB↓  IQA↑  IAA↑  CLIP↑
Holodeck  0.82  2.33  4.03  3.32  25.15
AnyHome   0.00  0.04  4.10  3.32  25.75
Ours      0.00  0.00  4.17  3.35  26.50

Table 1: Quantitative evaluation results. Our method achieves the best performance in terms of physical plausibility, image quality/aesthetics, and text-image alignment.


Method    Semantic↑  Layout↑  Path↑  MRR↑
Holodeck  3.43       3.12     3.06   0.59
AnyHome   3.07       2.59     2.80   0.45
Ours      3.77       3.23     3.41   0.80

Table 2: User study results. Our method achieved the highest scores across all metrics.

Acknowledgements

We thank Dr. Ziqi Wang from The Hong Kong University of Science and Technology for his valuable guidance and advice on this work. We also thank the anonymous reviewers for their recognition of our work. This work is supported by the National Natural Science Foundation of China (No. 62025207).

BibTeX

@article{Xiang2026Co,
        title={Co-Layout: LLM-driven Co-optimization for Interior Layout},
        author={Xiang, Chucheng and Bao, Ruchao and Feng, Biyin and Wu, Wenzheng and Liu, Zhongyuan and Guan, Yirui and Liu, Ligang},
        journal={Proceedings of the AAAI Conference on Artificial Intelligence},
        volume={40},
        number={17},
        pages={14371-14379},
        year={2026},
        month={Mar.},
        doi={10.1609/aaai.v40i17.38452},
        url={https://ojs.aaai.org/index.php/AAAI/article/view/38452}
}