GUIs quirks and quacks


In my search for a ready-to-use GUI system I came across several options, but in the end I decided to make my own, simply because none of them completely fit my needs, for the following reasons:

Maybe the most famous option. It looks very good and has official Python bindings.
I never managed to make it work on either Linux or Windows. On Linux, after four hours of compiling (and a previous attempt that had failed due to missing dependencies), I gave up.
On Windows, the compiling took only two hours, and only after I fixed some errors thanks to these instructions:

After finishing, I realized that the Python bindings had not been selected, so I started again, only to find several errors caused by the lack of Boost. Unfortunately, I didn't manage to compile the Python bindings for Boost either.


It comes recommended, looks promising and appears to be usable from Python, but… it needs Boost, so no further research on that one.

Maybe not as well known as the others, but they are completely Python based, so no need for Boost. I tried to convert both to plain PyOpenGL, since SimplUI uses Pyglet's event system (very different from mine) and BGEUI is based on the Blender Game Engine and the fixed-function pipeline (which I won't use), but after a few days I realized that the effort needed to convert and integrate either of them into my engine would take too much time, maybe close to the time needed to write my own. So I gave up on pre-made solutions.


The result, after three days of work: the first screenshot of the first 'Panel' control of my GUI:


Right now the widgets support opacity and are stretched automatically. In the picture the panel is shown over a 3D duck with specular reflection; I already have one directional light working. Current cost of the panel: 1 ms.

It uses only four vertices in a VBO plus an index buffer, and then three uniforms per control: position/size, color, and background image. This setup makes it possible to render the GUI controls either as normal 2D 'widgets' or as 3D objects attached to the scene models, and with a little more work the controls could be rendered using instancing where available (the whole GUI in one draw call).
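The shared-quad-plus-uniforms idea can be sketched in plain Python (numpy stands in for the VBO/index-buffer data, and the transform function mimics what a vertex shader would do with the position/size uniform; all names here are my own illustration, not the engine's actual code):

```python
import numpy as np

# One shared unit quad: four vertices plus a six-entry index buffer
# (two triangles), reused by every control. In the engine this data
# would live once in a VBO and an index buffer on the GPU.
QUAD_VERTS = np.array([[0.0, 0.0],
                       [1.0, 0.0],
                       [1.0, 1.0],
                       [0.0, 1.0]], dtype=np.float32)
QUAD_INDICES = np.array([0, 1, 2, 0, 2, 3], dtype=np.uint8)

class Panel:
    """Per-control state, uploaded as three uniforms before each draw:
    rect (position + size), color (alpha gives the opacity), texture id."""
    def __init__(self, x, y, w, h, color=(1.0, 1.0, 1.0, 1.0), texture=0):
        self.rect = np.array([x, y, w, h], dtype=np.float32)
        self.color = np.array(color, dtype=np.float32)
        self.texture = texture

def transformed_corners(panel):
    """What the vertex shader would compute from the 'rect' uniform:
    stretch the shared unit quad to the control's position and size."""
    pos, size = panel.rect[:2], panel.rect[2:]
    return QUAD_VERTS * size + pos

# A half-transparent 200x50 panel at (10, 20):
p = Panel(10, 20, 200, 50, color=(1.0, 1.0, 1.0, 0.5))
print(transformed_corners(p))
```

Since every control shares the same quad, adding instancing later only means packing these per-control uniforms into a buffer and issuing a single instanced draw call.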

And the gestation continues.