Welcome to GLOW

We’re really proud to introduce GLOW – a WebGL wrapper. GLOW is for those of you who have some WebGL experience and would like to get down to the core – shaders. Please try out GLOW for experimenting with shaders, creating your own render pipeline or making a WebGL demo. Please fork here.

GLOW was initiated right after we finished ROME – Three Dreams of Black. My main insight from ROME was the power of shaders, and I wanted a way to work with them without having to care about either low-level WebGL details or high-level abstractions.

Soon Olov Lassus joined, and together we’ve pushed GLOW to be clean and simple, yet powerful. Also check out JSShaper and Restrict Mode, which Olov is juggling as well.

2012+09+04

Build With Chrome: GLego

I was driving too fast on the highway when the phone rang and I was asked by North Kingdom if I could help with a very specific task: to render as many Lego bricks as possible, using WebGL. I had no clue how to do it, but didn’t hesitate one second – of course, I’m in. Later that day, an idea landed and started to grow. After a lot of twists and turns, it ended up as Build With Chrome and a render technique I’ll try to explain here. But first…

GLOW as Boilerplate

The GLego framework uses GLOW as boilerplate (that’s why this is posted on this blog) and it worked out pretty well. If you’re new to GLOW, it is a low-level WebGL wrapper that does pretty much nothing but wrap the (sometimes) daunting WebGL API into something more readable.

There were some kinks that needed ironing out and some missing features, both of which were addressed during development. It’s all in the GitHub repo now if you’d like to use it. There might still be a feature or two missing, but GLOW is pretty much “done” and ready to use in production.

Unity as Level Editor

Even though our geometry is very simple, we needed a good middleman between whatever 3D package the artists used and WebGL. We turned to the fantastic J3D-Unity exporter by Bartek Drozdz, modified it to our needs and were up and running. I highly recommend going this route, especially if you’re building complex levels. You should be able to use the free version of Unity without any practical limitations.

The Build Renderer

We needed two renderers – one for the build mode and one for the browse mode. The build mode renderer is just an ordinary renderer, using geometry and lights like most renderers do. To avoid WebGL state switching and keep performance up, the drawing order is optimized so that objects sharing geometry type, color and shader are rendered in sequence. GLOW does the heavy lifting here, keeping track of all (or most) WebGL states. The render loop pretty much looks like…

// renderCue is a two-dimensional array: renderCue[ type ][ color ] holds all
// objects of that geometry type and color
var type = renderCue.length,
  numColors = renderCue[ 0 ].length,
  color, objects, numObjects, object, c;

// draw all objects in type-color-order
while( type-- ) {
  color = numColors;
  while( color-- ) {
    objects = renderCue[ type ][ color ];
    numObjects = objects.length;
    while( numObjects-- ) {
      if( objects[ numObjects ].visible ) {
        objects[ numObjects ].draw();
      }
    }
  }
}

// draw all custom objects, using blend
glowContext.enableBlend( true );
c = customRenderCue.length;
while( c-- ) {
  if( customRenderCue[ c ].visible ) {
    customRenderCue[ c ].draw();
  }
}
glowContext.enableBlend( false );

// done!

There are no post-effects or anything, just plain simple… rendering. One of the real benefits of using GLOW is that your render loop becomes very readable.

The Browse Renderer – Data

The idea that landed that first day spawned from how Lego bricks are built around a very simple (genius) pattern. A brick has eight sides (only seven of them visible in Build with Chrome), and a position, size and number of pegs that all vary in discrete steps – we call these steps Lego Units (LU). A brick can only be rotated in steps of 90 degrees, which we don’t do in Build with Chrome – we simply swap width and depth. To describe a brick you simply need…

  • Position: X, Y, Z in LU
  • Size: Width, height, depth in LU
  • Color

Each building is placed on a baseplate, so the position of a single brick can be described in local baseplate coordinates, and because no baseplate is bigger than 256 LU (it’s actually 32×32), each component of the position only needs an unsigned byte. The same goes for the size. Even better, there is only a limited number of brick types, so width, height and depth can all be described with a single type index, which fits in a byte. Finally, because there is a limited number of Lego colors, the color can also be described as an index that fits in a byte. So in the end, you only need 5 bytes to describe one brick:

Position X, Y, Z in LU, Type and Color.

There are two huge upsides to this extremely compressed format. First, you can package and compress the data using PNG images for fast data transfer (I think most buildings fit within a 50×50 pixel image).

Second, you can (with some effort, admittedly) convert it into a single WebGL vec4 attribute and generate the geometry inside a vertex shader, making it possible to render thousands of bricks in a single draw call.
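
To make this concrete, here’s a minimal sketch of what that packing could look like (the names and the exact byte/component layout are made up for illustration – not necessarily what Build with Chrome actually ships):

// One brick = 5 bytes: x, y, z (baseplate-local LU), brick type index, color index.
// Hypothetical packing, for illustration only.
function packBricks( bricks ) {
  var data = new Uint8Array( bricks.length * 5 ), i;
  for( i = 0; i < bricks.length; i++ ) {
    data.set( [ bricks[ i ].x, bricks[ i ].y, bricks[ i ].z,
                bricks[ i ].type, bricks[ i ].color ], i * 5 );
  }
  return data;   // small enough to ship as the pixels of a tiny PNG
}

// One way to squeeze a brick into a single vec4 attribute: x, y, z as-is and
// type and color folded into the fourth component (decoded again with
// floor/mod in the vertex shader).
function brickToVec4( brick ) {
  return [ brick.x, brick.y, brick.z, brick.type * 256.0 + brick.color ];
}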

Browse Renderer – Deferred Rendering, Part 1

To add even more complexity, the initial designs included some depth of field (DoF) and vignetting, and there was talk about using screen space ambient occlusion (SSAO) to get some kind of shadowing. SSAO didn’t make it in, due to lack of time and lack of depth information (more about that soon).

Early on we had to make a decision: to go with a deferred approach or not. The cons were quite few – the obvious one being the lack of anti-aliasing, possibly multiple render passes and, as it would turn out, some data problems. The pros included fewer lighting calculations and simplified shaders, meaning more speed, which was a high priority. We went deferred.

We managed to find the holy grail of deferred rendering in WebGL – how to make do with a single rendering pass without having multiple render targets (MRT). The shaders simply output…

R = depth
G = (diffuse) color index
B = screen space normal X
A = screen space normal Y

This all works incredibly well with floating point textures (we could have gotten SSAO to work with this, had we gotten the production time) and works quite OK with normal unsigned byte textures. 8-bit precision on the depth doesn’t work for SSAO but is good enough for DoF. The live version of Build with Chrome uses unsigned byte textures. This could be updated so that machines with support for floating point textures get SSAO, for example.

The data problem mentioned above comes from having to deal with color as an index. All textures need to be converted to index textures, and all of them use the same palette. In the expand shaders (more about those soon) we simply use the index as a UV coordinate and sample the actual color from the palette texture. The palette texture is one pixel high and 256 pixels wide.
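
To make the layout a bit more concrete, a deferred fragment shader along those lines could look something like this (just a sketch with made-up names, not the actual Build with Chrome shader):

// Hypothetical deferred fragment shader: writes depth, color index and the
// screen space normal into one single render target (no MRT needed).
var deferredFragmentShader = [
  "precision mediump float;",
  "uniform float uColorIndex;",    // palette index, 0..255
  "uniform float uFar;",           // far plane, used to normalize depth to 0..1
  "varying vec3  vViewPosition;",  // view space position from the vertex shader
  "varying vec3  vNormal;",        // view space normal from the vertex shader
  "void main() {",
  "  vec3 n = normalize( vNormal );",
  "  gl_FragColor = vec4( -vViewPosition.z / uFar,",   // R = depth
  "                       uColorIndex / 255.0,",       // G = color index
  "                       n.x * 0.5 + 0.5,",           // B = normal X
  "                       n.y * 0.5 + 0.5 );",         // A = normal Y
  "}"
].join( "\n" );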

Browse Renderer – Deferred Rendering, Part 2

Because there was DoF in the design, the post shader, which is responsible for putting out the final picture, had to sample a lot of other fragments for each fragment it put out (the same goes for SSAO and for FXAA, the anti-aliasing technique we use).

Because the deferred shaders put out such limited information, and this data needs to be processed before you can actually use it in a meaningful way, we invented something we call expand shaders. There are three expand shaders, all very simple in themselves, that convert the deferred buffer into three buffers containing:

  • Camera relative position of a pixel
  • Normal of a pixel
  • Color of a pixel

Or at least in theory. In practice we didn’t need all this information and removed some of it to optimize. Again, this works amazingly well with floating point textures and OK with unsigned byte textures (the position buffer is very approximate, to say the least). These three buffers are then sent to the post shader, which does all the compositing and lighting.
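
As an example, the color expand pass could be as simple as this full screen shader (a sketch with made-up names, assuming the 256×1 palette texture described above):

// Hypothetical color expand shader: reads the color index from the deferred
// buffer and looks up the actual color in the palette texture.
var colorExpandFragmentShader = [
  "precision mediump float;",
  "uniform sampler2D uDeferred;",   // the packed deferred buffer
  "uniform sampler2D uPalette;",    // 256 x 1 pixels palette texture
  "varying vec2      vUV;",         // full screen quad UV
  "void main() {",
  "  float index  = texture2D( uDeferred, vUV ).g;",              // G = color index
  "  gl_FragColor = texture2D( uPalette, vec2( index, 0.5 ) );",
  "}"
].join( "\n" );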

Browse Renderer – Deferred Rendering, Part 3

This will mainly be about the pegs, but first a little about light. We only use directional lights, and we solved lighting with a separate light pass. It’s a simple shader that renders what could be described as a camera-projected, lit ball to a texture (there’s no geometry, but the result looks like a lit ball, plus some). This texture then simply becomes a lookup table for the screen space normal that the deferred shader put out. This means we can have almost as many lights as we like, with very little per-pixel cost.
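
In other words, lighting ends up as a single texture fetch per pixel, along these lines (a sketch, not the production shader):

// Hypothetical lookup: the screen space normal stored in the deferred buffer
// (BA channels, already in 0..1) indexes straight into the pre-rendered light texture.
var lightLookupFragmentShader = [
  "precision mediump float;",
  "uniform sampler2D uDeferred;",       // packed deferred buffer
  "uniform sampler2D uLightTexture;",   // the pre-rendered 'lit ball' lookup texture
  "varying vec2      vUV;",
  "void main() {",
  "  vec2 normalUV = texture2D( uDeferred, vUV ).ba;",
  "  gl_FragColor  = texture2D( uLightTexture, normalUV );",
  "}"
].join( "\n" );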

Ok, the pegs. From day one we knew that real geometry wasn’t an option – it’s simply too much to calculate and draw. So we pulled a trick that we came to call ”camera mapping”. Not sure that’s the right terminology. Anyway, what we do is render a single, high-resolution peg to a texture, using an orthographic camera. (Side note: we only render the screen space normals into the BA channels. The other two channels are used to create edges that mark the tiny gap between the bricks, but that’s another story.)

To get the mapping right, you use a screen-projected version of the UV coordinates. In Build with Chrome, this projection is actually done in JavaScript, as there are only four UV coordinates in the entire world.
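
The projection itself is just the usual clip space math – something like this, with made-up names and plain arrays instead of GLOW’s matrix types:

// Hypothetical: project a world space point with a column-major 4x4
// view-projection matrix and remap from NDC (-1..1) to UV space (0..1).
function projectToScreenUV( x, y, z, m ) {
  var cx = m[ 0 ] * x + m[ 4 ] * y + m[ 8 ] * z + m[ 12 ],
      cy = m[ 1 ] * x + m[ 5 ] * y + m[ 9 ] * z + m[ 13 ],
      cw = m[ 3 ] * x + m[ 7 ] * y + m[ 11 ] * z + m[ 15 ];
  return [ ( cx / cw ) * 0.5 + 0.5,
           ( cy / cw ) * 0.5 + 0.5 ];
}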

Now, that might sound simple, but when you have bricks with more than one peg (most of them), you need to get the UV wrapping to work for you, as the top side is just two triangles no matter how many pegs it has. This took some time to figure out, but in the middle of the night it all came to me. What you need to do is “unwrap” it…

The resulting buffer looks really weird, but because the process is reversed on screen it all looks dandy in the end.

There are some real limitations to the camera mapping technique. First, you can’t look towards the horizon, as the pegs have no height. Second, pegs at the edges of the screen become slightly distorted. This explains the tight perspective the browse mode has. I think we can push it and allow a freer browse camera, but we decided to stay on the safe side for now. Hopefully we’ll get the chance to improve this later on.

Wrap Up

I found two interesting/annoying bugs during the development:

1. If you have an FBO that uses a depth buffer, you also need to have a stencil buffer attached, or the depth buffer will fail on certain Macs (see the sketch after this list). There’s probably no performance loss if you keep stencil writes disabled.
2. On some other Macs, using ints in shaders fails. You have to use floor( theFloatValue ) instead.
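
For the first one, the workaround is simply to attach a combined depth-stencil renderbuffer when setting up the FBO. Roughly like this in raw WebGL, with the FBO bound (just a sketch):

// Attach a combined depth + stencil renderbuffer, even if the stencil is never used,
// to keep the depth buffer working on the affected Macs.
var depthStencil = GL.createRenderbuffer();
GL.bindRenderbuffer( GL.RENDERBUFFER, depthStencil );
GL.renderbufferStorage( GL.RENDERBUFFER, GL.DEPTH_STENCIL, width, height );
GL.framebufferRenderbuffer( GL.FRAMEBUFFER, GL.DEPTH_STENCIL_ATTACHMENT,
                            GL.RENDERBUFFER, depthStencil );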

This was (and hopefully continues to be) one of the most awesome projects I’ve been part of. Thanks to all the wonderful people that were involved!

For you geeks who are really interested in WebGL and would like to know more about the details, please don’t hesitate to contact me on Twitter.

2012+03+13

A few updates

Today we uploaded a few handy changes to the repo. These are the ones we think are the most interesting:

  • GLOW.Float and GLOW.Int now support arrays as the constructor parameter. This is very handy when you have uniform arrays in your shaders (see the sketch after this list).
  • GLOW.Matrix3.extractFromMatrix4() was added so you can easily get the rotation part of a GLOW.Matrix4 into your GLOW.Matrix3.
  • The GLOW.Compiler no longer calls GLOW.Texture.init() if there’s no texture to initialize. This allows you to init your shader without a texture set.
  • We added GLOW.Geometry.Cylinder so you can create cylinders easily.
  • We added interleaved attributes to the cache (I have no idea why we didn’t have this before, but now it’s there :)
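
For the uniform array case, a minimal sketch (the names and the array size are made up – the array simply maps onto a uniform float array in your shader):

// Hypothetical: a uniform float array passed to a shader as a GLOW.Float
// built from a plain JavaScript array.
var levels = new GLOW.Float( [ 0.25, 0.5, 0.75 ] );   // maps to "uniform float uLevels[3];"

var shaderConfig = {
  vertexShader: "...vertex shader using uLevels...",
  fragmentShader: "...fragment shader code...",
  data: {
    uLevels: levels,
    ...all other uniforms and attributes...
  },
  indices: myArrayOfIndices
}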

Enjoy!

2011+12+27

Refactored .elements

To make it easier and more logical to activate GL.drawArrays (see previous post) we’ve thrown away the good old elements property and changed it into…

var shaderConfig = {
  indices: myArrayOfIndices,
  primitives: GL.TRIANGLES,
  data: {
    // attributes and uniforms
  },
  usage: {
    primitives: GL.DYNAMIC_DRAW
  }
}

(First, a side note: elements is now called primitives in the usage object.) As you can see, we’ve split elements into indices and primitives, where…

  • indices is the array of indices
  • primitives is the type of primitive you’d like to use

If you leave out the primitives property, it defaults to GL.TRIANGLES. Also, the good old…

  • triangles
  • triangleStrip
  • triangleFan
  • points
  • lines
  • lineLoop
  • lineStrip

… properties still work and automatically set the primitives property.
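
For example, this would be equivalent to setting indices together with primitives: GL.LINE_STRIP (just a sketch)…

var shaderConfig = {
  lineStrip: myArrayOfLineIndices,   // shorthand: sets both indices and primitives
  data: {
    // attributes and uniforms
  }
}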

Now, to use GL.drawArrays instead of GL.drawElements you simply leave out the indices property…

var shaderConfig = {
  primitives: GL.POINTS,
  data: {
    // attributes and uniforms
  }
}

As you can see, you no longer have to set the number of primitives to draw (this is calculated from the attribute length).

Thanks Neil Mendoza for the excellent suggestion.
Hope you like it!

2011+12+25

Added support for drawArrays

As Neil Mendoza rightfully pointed out, GLOW only supported drawElements. We’ve now added support for drawArrays. It’s quite simple to use – you just…

var shaderConfig = {
  triangles: 10 * 3,
  primitives: GL.POINTS,
  data: {
    // attributes and uniforms
  }
}

var shader = new GLOW.Shader( shaderConfig );
shader.draw();

As you can see, the only difference is that instead of creating an array of indices, you tell the shader the length of the arrays to draw and which type of primitive you’d like to draw. As drawArrays doesn’t use indices like drawElements does, you’re not limited by the 65535 limit imposed by the Uint16Array used for drawElements.

Another nerdy addition is the new GLOW.Elements.offset property. It can be used to create animation-like behaviors by moving over the attributes. Please note that you have to keep GLOW.Elements.length in check so you don’t run outside the attribute buffers.

We’ve also updated the particle thingy we launched a while back and killed some bugs related to cloning textures.

Please pull!

2011+12+12

FBO Simulations

It’s been a while, but we’ve just added a new tutorial on how to use FBOs and GL.FLOAT for complex shader simulations. Click here to get into the game, and here to see an example.

2011+07+21

GLOW.Texture Changes

We just refactored and simplified the GLOW.Texture. Read all about the new texture capabilities here.

2011+07+15

Interleaved Attributes

GLOW had one feature, the cache, but we just added its second – automatically interleaved attribute data.

Interleaved attributes are a performance improvement which in some cases can make a big difference, but really it’s a very nerdy feature. It’s not widely supported by existing WebGL frameworks, so I’ll try to explain what it is and why you’d like to have it.

Instead of having one VBO for vertices, another for normals and a third for UV coordinates, you simply interleave them into one VBO. So instead of…

VBO A: Vertex Vertex Vertex...
VBO B: Normal Normal Normal...
VBO C: UV UV UV...

…you get…

VBO: Vertex Normal UV Vertex Normal UV...

The good thing about this is that you only have to bind one buffer, instead of one per attribute, when setting up the attributes prior to the draw call.

To get the right data to the right attribute, WebGL uses something called stride and offset (or pointer). You can read all about it here. This is all handled internally by GLOW, and you won’t even notice unless you use the WebGL Inspector and look at the trace.
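
For the curious, this is roughly what it looks like at the WebGL level – one single buffer bind, with the stride and offset parameters of vertexAttribPointer doing the work (a sketch with made-up locations):

// One interleaved VBO with 8 floats per vertex: position (3), normal (3), UV (2).
var stride = 8 * 4;   // bytes per vertex
GL.bindBuffer( GL.ARRAY_BUFFER, interleavedVBO );
GL.vertexAttribPointer( positionLocation, 3, GL.FLOAT, false, stride, 0 );       // bytes 0..11
GL.vertexAttribPointer( normalLocation,   3, GL.FLOAT, false, stride, 3 * 4 );   // bytes 12..23
GL.vertexAttribPointer( uvLocation,       2, GL.FLOAT, false, stride, 6 * 4 );   // bytes 24..31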

Most of the time you want this to be enabled and just work, but in some cases you might want to control how the attributes are interleaved, or not interleave them at all. This is possible through the shader definition object that you pass into the GLOW.Shader constructor…

var shaderInfo = {
  vertexShader: "...vertex shader code...",
  fragmentShader: "...fragment shader code...",
  data: { ...all uniform, attribute and texture data... },
  elements: [ ...Uint16Array containing element info... ],
  interleave: {
    vertices: false,
  }
}

In this example the attribute called ”vertices” won’t be interleaved (and will work just as an ordinary GLOW.Attribute), while all other attributes will be automatically interleaved.

var shaderInfo = {
  vertexShader: "...vertex shader code...",
  fragmentShader: "...fragment shader code...",
  data: { ...all uniform, attribute and texture data... },
  elements: [ ...Uint16Array containing element info... ],
  interleave: {
    vertices: false,
    normals: 5,
    uvs: 5,
    speed: 1,
    acceleration: 1
  }
}

In this example the ”vertices” attribute won’t be interleaved, ”normals” and ”uvs” will be interleaved together, and ”speed” and ”acceleration” will be interleaved together. The number can be any number you like – think of it as the id of the interleaved VBO you’re creating. If you happen to have more attributes than the ones listed in the interleave object, they will be automatically interleaved.

You can even control the usage for interleaved attributes by…

var shaderInfo = {
  vertexShader: "...vertex shader code...",
  fragmentShader: "...fragment shader code...",
  data: { ...all uniform, attribute and texture data... },
  elements: [ ...Uint16Array containing element info... ],
  interleave: {
    vertices: false,
    normals: 5,
    uvs: 5,
    speed: 1,
    acceleration: 1
  },
  usage: {
    normals: GL.DYNAMIC_DRAW,
    uvs: GL.DYNAMIC_DRAW
  }
}

…which will make the ”normals” and ”uvs” buffer use DYNAMIC_DRAW. If you only define usage for one of the attributes in an interleaved buffer, GLOW will throw a warning and default back to STATIC_DRAW.

Finally, some details: a new object called GLOW.InterleavedAttributes has been introduced. You cannot use the clone except-property on an interleaved attribute (as it’s now part of a set of interleaved attributes). The interleaved attribute name is generated from the attributes in the set – ”vertices_uvs_normals” for example – and is accessible directly on the GLOW.Shader, like other attributes (and uniforms).

2011+07+03

Usage

Usage in WebGL is a hint to the GPU about how to store buffer data. This parameter is now supported by GLOW.

The GLOW shader data object has a new property, simply called usage…

var shaderData = {
  vertexShader: "...the vertex shader code...",
  fragmentShader: "...the fragment shader code...",
  data: {
    viewMatrix: new GLOW.Matrix4(),
    vertices: myFloat32ArrayWithVertices,
    uvs: myFloat32ArrayWithUVs,
    ...and all other uniforms and attributes...
  },
  usage: {
    vertices: GL.DYNAMIC_DRAW,
    uvs: GL.STREAM_DRAW,
    triangles: GL.DYNAMIC_DRAW
  },
  triangles: myUint16ArrayWithTriangles
}

As you can see, I use the global GL object for the parameters – make sure that a GLOW.Context has been created before you create the shader data object.

  • GL.DYNAMIC_DRAW is for buffers that are updated often and drawn often.
  • GL.STREAM_DRAW is for buffers that are initialized once and drawn only a few times.
  • GL.STATIC_DRAW (default) is for buffers that are initialized once and drawn often.

Note that the elements (in this case triangles) can also be subject to usage. You don’t have to define the usage property at all, and if you do, you only have to set it for data that uses GL.STREAM_DRAW or GL.DYNAMIC_DRAW.

2011+07+03

Point, Lines, Triangles

Sometimes you don’t want to draw triangles, but rather lines or points. GLOW now has support for this, as well as some other types.

As you know, this is the GLOW shader data object…

var shaderData = {
  vertexShader: "...the vertex shader code...",
  fragmentShader: "...the fragment shader code...",
  data: {
    viewMatrix: new GLOW.Matrix4(),
    vertices: myFloat32ArrayWithVertices,
    ...and all other uniforms and attributes...
  },
  elements: myUint16ArrayWithElements
}

As of today the elements property can be switched to any of the following…

  • points
  • lines
  • lineLoop
  • lineStrip
  • triangles
  • triangleStrip
  • triangleFan

…which are the different primitive types that WebGL supports. elements is the same as triangles, but we decided to keep it for now. If you’re into details, you can see that GLOW.Elements has some new parameters – data, type and usage (more about usage in the next post).
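
For example, drawing the same attribute data as points is just a matter of renaming the property (a sketch)…

var shaderData = {
  vertexShader: "...the vertex shader code...",
  fragmentShader: "...the fragment shader code...",
  data: {
    viewMatrix: new GLOW.Matrix4(),
    vertices: myFloat32ArrayWithVertices,
    ...and all other uniforms and attributes...
  },
  points: myUint16ArrayWithElements   // instead of elements: draws GL.POINTS
}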

2011+06+22

Benefits with GLOW

Several people have asked me why they should use GLOW when there are a lot of more competent libraries out there. It’s a good question, which I’ll try to answer here.

GLOW is a WebGL wrapper, so you’re flying very LOW over the GL – that’s where the name comes from. This also means that GLOW is not for beginners, but for those of you with some experience of WebGL or OpenGL. It also means that it’s not a WebGL 3D framework (like Three.js). It’s just a wrapper.

Given that you have experience, you’re probably familiar with code looking like…

GL.useProgram( program );
GL.uniformMatrix4fv( viewMatrixLoc, false, viewMatrix );
GL.uniformMatrix4fv( projMatrixLoc, false, projMatrix );
GL.uniform1i( sampleLoc, 0 );
GL.activeTexture( GL.TEXTURE0 );
GL.bindTexture( GL.TEXTURE_2D, texture );
GL.enableVertexAttribArray(0);
GL.enableVertexAttribArray(1);
GL.enableVertexAttribArray(2);
GL.bindBuffer( GL.ARRAY_BUFFER, vertices );
GL.vertexAttribPointer( 0, 3, GL.FLOAT, false, 0, 0 );
GL.bindBuffer( GL.ARRAY_BUFFER, uvs );
GL.vertexAttribPointer( 1, 2, GL.FLOAT, false, 0, 0 );
GL.bindBuffer( GL.ARRAY_BUFFER, normals );
GL.vertexAttribPointer( 2, 3, GL.FLOAT, false, 0, 0 );
GL.bindBuffer( GL.ELEMENT_ARRAY_BUFFER, faces );
GL.drawElements( GL.TRIANGLES, 36, GL.UNSIGNED_SHORT, 0 );

This is pretty much the only way to draw something with WebGL – it’s designed this way. GLOW wraps this with a tiny overhead (most of which is the cache) and all you do is…

myShader.draw();

It’s in your shader code that the magic happens, and we think that’s where you should spend your time. The shader data format is similar to the Three.js custom shader format – only slightly simpler – all to get you started within minutes…

var shaderData = {
  vertexShader: "...the vertex shader code...",
  fragmentShader: "...the fragment shader code...",
  data: {
    viewMatrix: new GLOW.Matrix4(),
    vertices: myFloat32ArrayWithVertices,
    ...and all other uniforms and attributes...
  },
  elements: myUint16ArrayWithElements
}

var shader = new GLOW.Shader( shaderData );
shader.draw();

GLOW comes with an extras library including matrices, vectors, hierarchies, geometry parsers and other helpful objects, all (most probably) working out of the box. As GLOW is just a wrapper, it’s compatible with all other WebGL libraries and can be used to extend these in all possible ways.

So, the benefits are:

  • You get close to WebGL without having to deal with the WebGL API
  • It’s very easy to create shaders
  • There are a lot of extras that help you get going
  • It’s compatible with all other WebGL frameworks

Enjoy!