Going from Blender to SPACE

If you want to get your 3D assets from Blender to SPACE, there are some things you ought to know. This article covers everything except optimization, which deserves its own write-up that you can find here.

Let’s start with the file type. 

GLTF

GLTF is the “JPEG for 3D models” and has become the standard interchange format for displaying 3D models on the web (and in other places too, like Microsoft Flight Simulator). GLTF files can be used in almost any engine and are especially useful in Web3D applications like Three.js, Babylon.js, Cesium, etc.

The format comes in two flavors: “GLTF” and “GLB”. GLB is the most common; it is a single binary file that contains everything that describes the object. This is what you will use for any files that you upload to SPACE.

A GLTF is like a GLB except that its textures and data are stored as separate files. This has the benefit of letting you swap out textures without having to swap out the entire model.
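If you are curious about what actually lives inside a GLB, here is a small Python sketch (standard library only) that reads the container header and the embedded glTF JSON and prints what the file contains. The file name model.glb is a placeholder for your own export.

```python
import json
import struct

# A GLB is a binary container: a 12-byte header, then chunks.
# Header: magic "glTF", version (uint32), total length (uint32).
# First chunk: length (uint32), type "JSON", then the glTF JSON itself.
with open("model.glb", "rb") as f:  # placeholder file name
    magic, version, length = struct.unpack("<4sII", f.read(12))
    assert magic == b"glTF", "not a GLB file"

    chunk_length, chunk_type = struct.unpack("<I4s", f.read(8))
    assert chunk_type == b"JSON"

    gltf = json.loads(f.read(chunk_length))

# Print the sections that describe the model.
print("glTF version:", version, "| file size:", length, "bytes")
for key in ("meshes", "materials", "textures", "images", "animations"):
    print(f"{key}: {len(gltf.get(key, []))}")
```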

If you would like to test your file before uploading it, there are two viewers you can drag and drop the file into to preview it: Babylon and Don McCurdy’s GLTF Viewer.

A standard GLTF test model running in a browser. Credit.

The number of companies supporting the GLTF standard is growing every day. Credit: Worldviz.

Basic Asset Elements

The Model

The elements that make up a 3D mesh. Credit Wikipedia.

A 3D model is composed of one basic element: the triangle or “polygon”. These triangles can be decomposed further into three kinds of elements: a face, three edges, and three vertices. 3D modeling is the process of manipulating these elements to create a form in space. Taken together, the resulting network of polygons is referred to as a mesh. This is what your user will see when they interact with your object.
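To make those terms concrete, here is a tiny sketch using Blender’s Python API that builds a one-triangle mesh from three vertices and a single face. The object name and coordinates are arbitrary placeholders.

```python
import bpy

# Three vertices (corners) and one triangular face connecting them.
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
faces = [(0, 1, 2)]  # indices into the vertex list; edges are created implicitly

# Build the mesh datablock and wrap it in an object so it appears in the scene.
mesh = bpy.data.meshes.new("TriangleMesh")
mesh.from_pydata(verts, [], faces)
mesh.update()

obj = bpy.data.objects.new("Triangle", mesh)
bpy.context.collection.objects.link(obj)
```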

Materials

A simple polygon mesh contains no color information. What gives it life and makes a cube look like a marble block is its material. Materials are a combination of several textures and a shading algorithm that modifies the light that hits the polygons. This modified light is displayed on your screen as pixels, which your eye interprets as a rough countertop or a smooth mirror. This sounds complex, but in practice it’s just dragging a few sliders and swapping pictures to tell the shader how shiny something is.

An example material on a sphere. Credit AmbientCG.

Textures

Albedo. Credit AmbientCG.

Roughness. Credit AmbientCG.

Ambient Occlusion. Credit AmbientCG.

Normal Map. Credit AmbientCG.

Textures are images that you apply to your 3D model through its material. You can use any image as a texture – even a photo that you took with your camera. Most engines, including Three.js and Babylon.js (standard Web3D engines that load GLTF files), use PBR shading, and you will see this term used a lot. It stands for Physically Based Rendering and refers to a standardized arrangement of textures and shading math across multiple platforms. PBR breaks materials into the following layers or channels:

  • Albedo (aka Diffuse): What’s the object’s color?
  • Reflectivity (aka specular): How shiny is the object?
  • Roughness: How sharp is the reflection?
  • Ambient Occlusion: A texture for mapping the tiny shadows that occur between two bricks, for example.

You will use either a single value (from 0 to 1) or an image texture to set these channels and give life to your objects. For a super detailed breakdown of the theory and concepts behind PBR, check out Physically-Based Rendering, And You Can Too! and Basic Theory of Physically-Based Rendering by Marmoset Toolbag.
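In Blender these channels live on the Principled BSDF node, which the GLTF exporter carries across to SPACE. Below is a minimal Python sketch that wires albedo, roughness, and normal textures into a material; the material name and texture file paths are placeholders for your own assets.

```python
import bpy

# Create a material and grab the default Principled BSDF node,
# which maps onto the PBR channels described above.
mat = bpy.data.materials.new("MarbleBlock")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
principled = nodes["Principled BSDF"]

def add_texture(path, non_color=False):
    """Load an image and add it to the node tree as an Image Texture node."""
    tex = nodes.new("ShaderNodeTexImage")
    tex.image = bpy.data.images.load(path)
    if non_color:
        # Roughness and normal maps hold data, not color, so skip sRGB conversion.
        tex.image.colorspace_settings.name = "Non-Color"
    return tex

# Albedo -> Base Color, Roughness -> Roughness (paths are placeholders).
albedo = add_texture("//textures/marble_albedo.png")
links.new(albedo.outputs["Color"], principled.inputs["Base Color"])

rough = add_texture("//textures/marble_roughness.png", non_color=True)
links.new(rough.outputs["Color"], principled.inputs["Roughness"])

# Normal maps go through a Normal Map node before reaching the shader.
normal_tex = add_texture("//textures/marble_normal.png", non_color=True)
normal_map = nodes.new("ShaderNodeNormalMap")
links.new(normal_tex.outputs["Color"], normal_map.inputs["Color"])
links.new(normal_map.outputs["Normal"], principled.inputs["Normal"])

# Assign the material to the active object.
bpy.context.active_object.data.materials.append(mat)
```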

UV Unwrapping

UV unwrapping a cube. Credit Wikipedia.

In order to map a 2D image onto a 3D object, we unfold that object flat using a process called UV Unwrapping. You can think of this like making a shirt, then cutting it apart at the seams and laying it flat so you can draw on it.
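You will usually unwrap interactively in Blender’s UV Editing workspace, but the same operation can be scripted. Here is a minimal sketch, assuming the object you want to unwrap is already selected and active:

```python
import bpy

# Unwrap the active object: switch to Edit Mode, select every face,
# then run the standard angle-based unwrap with a small island margin.
bpy.ops.object.mode_set(mode="EDIT")
bpy.ops.mesh.select_all(action="SELECT")
bpy.ops.uv.unwrap(method="ANGLE_BASED", margin=0.02)
bpy.ops.object.mode_set(mode="OBJECT")
```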

SPACE/Hubs-Specific

If you are creating an environment for SPACE, you should know about lightmaps. These are images that we use to give a space more realistic lighting. In CG movies like Toy Story, a single frame with realistic lighting can take more than three hours to render, tracing out all of the light bounces to paint a realistic scene. In an interactive application we obviously can’t do that, so instead we pre-calculate the lighting and apply it to the model as an image map.

To do this you will need to bake the texture. The process can be found in this article on lightmapping in SPACE. In order to use the lightmap you will need to plug it into a “MOZ_Lightmap” node.
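The full, Hubs-specific workflow (including the MOZ_Lightmap node) is covered in the linked article; purely as a rough sketch of the baking step itself, the Python below creates a bake target image and bakes the lighting in Cycles. It assumes the object is selected and already has a material, and a real lightmap would also need a dedicated second UV layer, which is omitted here. The image name and output path are placeholders.

```python
import bpy

# Baking requires Cycles; Blender bakes into the image assigned to the
# *active* Image Texture node of the object's material.
scene = bpy.context.scene
scene.render.engine = "CYCLES"

obj = bpy.context.active_object
nodes = obj.active_material.node_tree.nodes

# Create the target image and an Image Texture node to receive the bake.
bake_image = bpy.data.images.new("Shop_Lightmap", width=1024, height=1024)
bake_node = nodes.new("ShaderNodeTexImage")
bake_node.image = bake_image
nodes.active = bake_node  # the bake writes into the active image node

# Bake only the lighting (direct + indirect), not the albedo, so the
# result can be layered on top of the regular textures at runtime.
bpy.ops.object.bake(type="DIFFUSE",
                    pass_filter={"DIRECT", "INDIRECT"},
                    use_clear=True)

bake_image.filepath_raw = "//shop_lightmap.png"  # placeholder output path
bake_image.file_format = "PNG"
bake_image.save()
```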

To use the full extent of SPACE’s capabilities, including lightmapping, you will need the Mozilla Hubs plugin for Blender; you can download it, along with installation instructions, here.

Other things you can do with this plugin include creating speaker and mic setups, defining audio falloff zones, placing lights directly in Blender, and more.

Lightmap for the Small Shop on SPACE Metaverse.

Exporting to SPACE

Mozilla Hubs has a great breakdown of things to know when using the Blender GLTF exporter and getting a model into SPACE/Hubs. You can find it here. The only thing it does not cover is how to set up Ambient Occlusion, which has its own article on this wiki here.
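If you prefer to script the export, the same exporter can be driven from Blender’s Python console. The call below (single GLB, selected objects only, custom properties kept, modifiers applied) is one reasonable starting point rather than the only valid configuration, and the output path is a placeholder.

```python
import bpy

# Export the selected objects as a single self-contained .glb for SPACE.
bpy.ops.export_scene.gltf(
    filepath="//shop.glb",    # placeholder output path
    export_format="GLB",      # single binary file, as SPACE expects
    use_selection=True,       # export only what is currently selected
    export_extras=True,       # keep custom properties in the glTF
    export_apply=True,        # apply modifiers before export
)
```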

Importing to SPACE

Once your GLTF is exported, it’s time to get it into SPACE. Go to the Spoke editor (https://space-metaverse.com/spoke/) after logging into SPACE Metaverse. On the Projects window, select your project or click “New Project”, then select “New Empty Project”.

Once in the project you’ll be greeted with Spoke’s interface, which is labeled below.

To upload your asset, simply navigate to My Assets in the Asset Browser at the bottom of the screen. You can click the Upload button and find your file, or simply drag and drop it from your computer into this window.

To place an asset in the world space, simply click on it once, then move your mouse cursor into the 3D viewport. The asset will appear on your cursor and you can click again to place it where you want.

To manipulate the location, size and rotation of the asset, you can click on the buttons on the top left of the screen or use the shortcuts: “T” for Transform, “R” for Rotate and “Y” for Scale. You may want to turn off snapping; to do so click the magnet button to make sure it is not highlighted. To adjust snap increments, use the two boxes to the right of the magnet.

If you would like to update an asset, upload the new version of it to the Asset Browser. Right-click on the new version of the asset and select “Copy URL”. In the 3D Viewport or in the Scene Browser, click on your asset and find the “Model URL” entry box. Paste the new URL into this box and your asset will be updated in the 3D space.
