M3DLayout: A Multi-Source Dataset of 3D Indoor Layouts and Structured Descriptions for 3D Generation
In text-driven 3D scene generation, object layout serves as a crucial intermediate representation that bridges high-level language instructions with detailed geometric output. It not only provides a structural blueprint for ensuring physical plausibility but also supports semantic controllability and interactive editing.
However, the learning capabilities of current 3D indoor layout generation models are constrained by the limited scale, diversity, and annotation quality of existing datasets. To address this, we introduce M3DLayout, a large-scale, multi-source dataset for 3D indoor layout generation. M3DLayout comprises 21,367 layouts and over 433k object instances, integrating three distinct sources: real-world scans, professional CAD designs, and procedurally generated scenes. Each layout is paired with detailed structured text describing global scene summaries, relational placements of large furniture, and fine-grained arrangements of smaller items. This diverse and richly annotated resource enables models to learn complex spatial and semantic patterns across a wide variety of indoor environments.
To assess the potential of M3DLayout, we establish a benchmark using a text-conditioned diffusion model. Experimental results demonstrate that our dataset provides a solid foundation for training layout generation models. Its multi-source composition enhances diversity, notably through the Inf3DLayout subset, which provides rich small-object information and enables the generation of more complex and detailed scenes. We hope that M3DLayout can serve as a valuable resource for advancing research in text-driven 3D scene synthesis.
Dataset Description
The dataset is split into three parts:

| Type | Description | Content | Size |
|---|---|---|---|
| `scene_dataset` | Original scenes with separated object geometries and textures | • **Infinigen** (8,392 rooms with normal object density + 7,607 rooms with relatively low object density = 15,999 rooms): `scene.blend` files, i.e. scenes with segmented objects and textures.<br>• **Matterport3D** (90 houses): post-processed by segmenting each house into separate PLY objects; import each subdirectory as a whole into Blender to obtain the complete scene.<br>• **3D-Front** (5,173 rooms): no additional processing applied, so you can download the original data directly from the official 3D-FRONT link. | 3 TB (compressed) |
| `rendering_dataset` | Images rendered from the scenes | • **Infinigen** (15,864 rooms): floor masks, oblique-view scene renderings, top-down scene renderings, text descriptions, detailed per-scene JSON.<br>• **Matterport3D** (90 houses + 4 JSON files + 1 README.md): per-room floor masks, per-room top-down layout renderings, multi-level detailed per-scene JSON.<br>• **3D-Front** (5,173 rooms): floor masks, top-down scene renderings. | 250 GB (compressed) |
| `layout_dataset` | Layouts extracted from the scenes | `<data_source>_train.json`, `<data_source>_test.json`, and `<data_source>_val.json` for Infinigen, Matterport3D, and 3D-Front, including object count, category, location, bbox size, rotation, multi-level detailed descriptions, etc. | 31 MB (compressed) |
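The layout JSON files can be consumed with nothing but the standard library. Below is a minimal sketch, assuming hypothetical field names (`bounds`, `objects`, `category`, `location`, `size`, `rotation`) modeled on the fields listed above — the actual schema in the released files may differ:

```python
import json

# Hypothetical layout entry modeled on the fields described above (object
# category, location, bbox size, rotation); the real field names in the
# M3DLayout JSON files may differ. In practice you would load a split with
# e.g. json.load(open("3dfront_train.json")).
layout = json.loads("""
{
  "bounds": [[-5.5, 0.0, -5.6], [5.7, 3.7, 5.6]],
  "objects": [
    {"category": "sofa", "location": [1.2, 0.0, -0.5],
     "size": [2.0, 0.9, 1.0], "rotation": 90.0},
    {"category": "coffee_table", "location": [1.1, 0.0, 0.8],
     "size": [1.1, 0.4, 0.6], "rotation": 0.0}
  ]
}
""")

def scene_extent(layout):
    """(x, y, z) extent of the scene's axis-aligned bounding box."""
    lo, hi = layout["bounds"]
    return tuple(h - l for l, h in zip(lo, hi))

def count_by_category(layout):
    """Number of object instances per category."""
    counts = {}
    for obj in layout["objects"]:
        counts[obj["category"]] = counts.get(obj["category"], 0) + 1
    return counts

print(scene_extent(layout))
print(count_by_category(layout))
```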
In short:
- For scene generation, understanding, reconstruction, embodied AI, and similar tasks, download `scene_dataset` directly. Because every object in each scene is cleanly separated, you can also extract point clouds or run further detection, segmentation, or editing tasks.
- For image/text-to-layout, image/text-to-scene, or other 2D tasks, download `rendering_dataset`.
- To use the intermediate scene layouts in your downstream research, download `layout_dataset`.
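Given the size difference between the three parts, it is worth downloading only the one you need. Here is a minimal sketch using `huggingface_hub.snapshot_download` with an `allow_patterns` filter; the glob patterns are an assumption that each part lives in a top-level folder named as in the table above:

```python
# Download a single part of the dataset instead of the full ~3 TB repo.
# The globs below assume the parts live in top-level folders named after
# the table above; adjust them if the repo layout differs.
PART_PATTERNS = {
    "scene_dataset": ["scene_dataset/*"],          # ~3 TB compressed
    "rendering_dataset": ["rendering_dataset/*"],  # ~250 GB compressed
    "layout_dataset": ["layout_dataset/*"],        # ~31 MB compressed
}

def download_part(part: str, local_dir: str = "./M3DLayout") -> str:
    """Fetch one part of the dataset repo and return its local path."""
    # Imported lazily so the sketch is importable without huggingface_hub.
    from huggingface_hub import snapshot_download

    return snapshot_download(
        repo_id="Metaverse-AI-Lab/M3DLayout",
        repo_type="dataset",
        allow_patterns=PART_PATTERNS[part],
        local_dir=local_dir,
    )

if __name__ == "__main__":
    # Start with the smallest part.
    print(download_part("layout_dataset"))
```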
We provide a rich set of functions in `render.py`, `util.py`, and `visualization_mlayout.py` from Object-Retrieval-Layout2Scene for post-processing (visualizing, filtering, rendering, etc.) the Infinigen scene data.
Related Links
- Github Repository: https://github.com/Graphic-Kiliani/M3DLayout-code
- Paper: https://arxiv.org/abs/2509.23728
- Project Page: https://graphic-kiliani.github.io/M3DLayout/
Citation
If you find this dataset useful, please cite:
@article{zhang2025m3dlayout,
  title={M3DLayout: A Multi-Source Dataset of 3D Indoor Layouts and Structured Descriptions for 3D Generation},
  author={Zhang, Yiheng and Cai, Zhuojiang and Wang, Mingdao and Guo, Meitong and Li, Tianxiao and Lin, Li and Wang, Yuwang},
  journal={arXiv preprint arXiv:2509.23728},
  year={2025},
  url={https://arxiv.org/abs/2509.23728},
}
Dataset Card Contact
If you have any questions about our dataset or would like to collaborate in any form, feel free to contact 'e1349382@u.nus.com'.