
Tencent has announced the global launch of its Hunyuan 3D engine. The platform is designed to enable creatives and businesses to generate 3D objects faster and with less specialized knowledge. Rather than requiring traditional modeling work, the tools generate usable 3D assets from multimodal inputs such as text, reference images, or sketches.
According to Tencent, this shortens the path from concept to usable model from days or weeks to a few minutes. The engine targets commercially usable 3D content for games, e-commerce, film and advertising productions, social media, and 3D printing. Users of the global offering receive a quota of 20 generations per day, while companies that connect to the Hunyuan 3D Model API via Tencent Cloud receive 200 free credits for asset generation. Tencent is thus addressing both individuals and organizations that regularly work with 3D content.
The technical basis is a generative large-scale AI model from the Hunyuan series, which covers text-to-image, video, and 3D generation. Since Tencent released several of the 3D models as open source in November 2024, they have been downloaded more than three million times. The series has gone through several iterations to improve the quality and accuracy of the generated geometries: Hunyuan 3D 3.0 focuses on high-quality object assets, while the specialized Hunyuan3D World variants generate walkable environments that can be placed into game or VR scenarios.
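For readers who want to try the open-source releases, the minimal sketch below shows roughly how the published Hunyuan3D-2 image-to-shape pipeline is invoked in Python. The package and class names follow the examples in the public repository at the time of writing and may change between versions; the input image path is a placeholder.

```python
# Minimal sketch: image-to-mesh with the open-source Hunyuan3D-2 weights.
# Names follow the public Hunyuan3D-2 examples and may differ in newer
# releases; "chair.png" is a placeholder reference image.
from hy3dgen.shapegen import Hunyuan3DDiTFlowMatchingPipeline

# Fetch the pretrained shape-generation weights from Hugging Face.
pipeline = Hunyuan3DDiTFlowMatchingPipeline.from_pretrained("tencent/Hunyuan3D-2")

# Generate an untextured mesh from a single reference image.
mesh = pipeline(image="chair.png")[0]

# Export to a standard format that game engines and slicers accept.
mesh.export("chair.glb")
```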
The engine is available to companies as an API via Tencent Cloud, which allows generation to be integrated directly into existing production pipelines. According to the provider, more than 150 companies in mainland China already use the service, including Unity China, consumer 3D printer manufacturer Bambu Lab, and content platform Liblib. In 3D printing, being able to generate application-specific models from textual or visual specifications far more quickly can streamline development, producing more design variants in less time.
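To make the pipeline-integration idea concrete, the sketch below submits a text prompt to a generation endpoint and polls until the finished asset can be downloaded. The endpoint URL, field names, and authorization scheme are illustrative assumptions, not the documented Tencent Cloud interface, which uses its own request signing; only the general submit-poll-download pattern is shown.

```python
# Hypothetical integration sketch: submit a prompt, poll until the asset
# is ready, then download it. Endpoint, fields, and auth are assumptions,
# NOT the documented Tencent Cloud Hunyuan 3D API.
import time
import requests

API_BASE = "https://example-hunyuan3d.invalid/v1"      # placeholder endpoint
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}      # placeholder auth

def generate_asset(prompt: str, out_path: str) -> None:
    # Submit a generation job from a text prompt.
    job = requests.post(f"{API_BASE}/jobs", json={"prompt": prompt},
                        headers=HEADERS, timeout=30).json()

    # Poll until the job finishes; a real service might offer webhooks.
    while True:
        status = requests.get(f"{API_BASE}/jobs/{job['id']}",
                              headers=HEADERS, timeout=30).json()
        if status["state"] == "done":
            break
        time.sleep(5)

    # Download the generated mesh (e.g. a .glb file) into the pipeline.
    asset = requests.get(status["asset_url"], timeout=60)
    with open(out_path, "wb") as f:
        f.write(asset.content)

generate_asset("low-poly wooden chair", "chair.glb")
```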