I have been working through LearnOpenGL lately because my graphics programming knowledge lags behind. Last time I did graphics, it was all fixed-function pipeline.

The problem

In order to use OpenGL, it is generally recommended to use an OpenGL loading library. These libraries handle requesting the gl* function pointers for the functions you might call, depending on which OpenGL version and extensions you use.
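To make the problem concrete, here is a minimal C++ sketch of the pattern a loader generates for each entry point. The typedef mirrors glBindBuffer's real signature, but GetProcAddressStub is a made-up stand-in for the platform call (wglGetProcAddress, glXGetProcAddress, or SDL_GL_GetProcAddress), so the sketch runs without a GL context:

```cpp
#include <cstring>

// Per-function typedefs a loader generates; this one mirrors glBindBuffer
typedef unsigned int GLenum;
typedef unsigned int GLuint;
typedef void (*PFNGLBINDBUFFERPROC)(GLenum target, GLuint buffer);

// The loaded pointer the rest of the program calls through
static PFNGLBINDBUFFERPROC glBindBuffer = nullptr;

// Stand-in for wglGetProcAddress/glXGetProcAddress so this runs without a
// GL context; a real driver would return its own implementation
static void StubBindBuffer(GLenum, GLuint) {}
static void* GetProcAddressStub(const char* name)
{
	return std::strcmp(name, "glBindBuffer") == 0 ? (void*)&StubBindBuffer : nullptr;
}

// What "loading" amounts to: look up every needed gl* name at runtime
bool LoadGLFunctions()
{
	glBindBuffer = (PFNGLBINDBUFFERPROC)GetProcAddressStub("glBindBuffer");
	return glBindBuffer != nullptr;
}
```

A real loader repeats this lookup for hundreds of entry points, which is why generators exist at all.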

I am not a huge fan of this because it feels like it shifts complexity onto the users of OpenGL. When designing APIs, I think it's good practice to avoid designs that spread complexity. I understand the desire for loaders because they make it possible to write drivers with only partial OpenGL support, but it does make things harder for the end user of the API.

Solutions I don't like

LearnOpenGL chose GLAD as their OpenGL loader. GLAD is a code-generator written in Python which generates OpenGL header/source files for your chosen version/extensions of OpenGL. There were several reasons why I was quickly put off by GLAD:
• Website interface. In order to create your configuration, the recommended workflow is to go to the website (linked previously) and select your choices via drop-down boxes. You can then download the generated files as a zip file. I wasn't a fan of this because it creates a manual step that relies on a 3rd-party service (the website) still being hosted
• Python. Cakelisp is meant to be a low- to no-dependency download where all you need is a C++ toolchain. If I wanted to avoid using GLAD's web interface and instead run the Python scripts offline, now I need to add Python as a dependency to GameLib. Python usually leads to use of PIP, which in my experience leads to very fragile and bloated project structures, as well as a poor Windows experience

Surprisingly, many of the other loaders were also insufficient for my use case because they are written in Python, Perl, etc. That rules out the following loaders:
• GLEW: Written in C, but requires Perl and a Unix environment if you want to generate new configurations. Windows support is important to me, so that's a no-go
• GL3W: Written in Python
• glatter: Written in Python
• glbinding: Written in Python

Note that while these loaders can be used without using Python/Perl/etc., I want to be able to generate the actual header/source files from scratch if necessary.

The solution I like

I continued through the list of loading libraries until I found Galogen (GitHub).

Galogen strikes the perfect balance between flexibility and sane implementation to me:
• It's written in C++, so any Cakelisp environment will easily handle it
• It's only two C++ files, which seems like a perfectly reasonable amount of code for the task. GLEW, for comparison, has dozens of files in its repository
• It has an intuitive command-line interface

The repository hasn't been touched in several years, which is both a liability and possibly a feature. It means I'm on my own supporting it, but it also means it probably hasn't needed to be updated.

Cakelisp's compile-time library makes running child processes very easy. This means I can both compile Galogen from source and generate fresh headers during Cakelisp's compile-time stage, which occurs right before building the final project. For example, here's the code I use to generate the configuration for GameLib:
    (run-process-sequential-or
     (galogen-executable-path gl-specification
      "--api" "gl" "--ver" "4.6" "--profile" "core"
      "--filename" gl-generated-output-path)
     (Log "error: failed to generate gl headers via galogen\n")
     (return false))

If the process fails for whatever reason, Cakelisp will print a relevant error and halt the build process.
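The same fail-fast pattern can be sketched in plain C++. Here std::system stands in for Cakelisp's process runner, and the command string is a placeholder for the Galogen invocation above:

```cpp
#include <cstdio>
#include <cstdlib>

// Run a generation step and halt the build on failure, mirroring the
// run-process-sequential-or snippet above. The command is a placeholder.
bool RunOrFail(const char* command)
{
	if (std::system(command) != 0)
	{
		std::fprintf(stderr, "error: failed to generate gl headers via galogen\n");
		return false;
	}
	return true;
}
```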

This is really exciting to me, because this setup is implemented in the same language, in the same file.

Currently, I rely on the XML specification included with Galogen. If I wanted to make the system support cutting-edge OpenGL, I could build CURL and download the latest specification, still during project compile-time.

Another, more Cakelisp-y solution

Cakelisp offers several options for generating and inspecting code written in Cakelisp. For example, we could write all OpenGL calls like this[1]:
    (gl BindBuffer GL_ARRAY_BUFFER vertex-buffer-id)

The space between gl and BindBuffer would allow for defining gl as a macro:

    (defmacro gl (function-name symbol &rest &optional arguments any)
      (var function-name-str (* (const char))
           (call-on c_str (field function-name contents)))
      ;; Store a list of all the gl functions we actually use
      (get-or-create-comptime-var used-opengl-functions
                                  (<> (in std map) (in std string) int))
      (set (at function-name-str used-opengl-functions) 1)
      ;; Generate the function name
      (var full-function-name-token Token (deref function-name))
      (var full-function-name ([] 128 char) (array 0))
      (PrintfBuffer full-function-name "gl%s" function-name-str)
      (set (field full-function-name-token contents) full-function-name)
      ;; Output the actual C function invocation
      (tokenize-push output
        (call (token-splice-addr full-function-name-token)
              (token-splice-rest arguments tokens)))
      (return true))

At this point we have our gl* invocations generated, but we still need the loader to create a header with the function signatures. The variable used-opengl-functions is a sorted, unique tree of all the OpenGL functions our project actually uses.
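Because used-opengl-functions maps names to ints, duplicate uses collapse into one entry and iteration order is sorted, so the generated header is deterministic. A small C++ illustration (the function names here are just examples):

```cpp
#include <map>
#include <string>

// Collect every gl* function the project calls; std::map keeps the names
// unique and sorted, so the generated header is stable between runs
size_t CountUsedGlFunctions()
{
	std::map<std::string, int> usedOpenGlFunctions;
	usedOpenGlFunctions["BindBuffer"] = 1;
	usedOpenGlFunctions["GenBuffers"] = 1;
	usedOpenGlFunctions["BindBuffer"] = 1; // repeated use; still one entry
	return usedOpenGlFunctions.size();     // 2
}
```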

We can generate the required header as part of Cakelisp's post-references-resolved phase, which runs after most code has been parsed but before building. This is the only phase where code generation is possible. The generator might look something like this pseudo-code:

    (defun-comptime generate-gl-loader ()
      (load-opengl-xml-spec) ;; This would be a separate comptime function handling this
      (get-or-create-comptime-var used-opengl-functions
                                  (<> (in std map) (in std string) int))
      (var generated-signatures-tokens (<> (in std vector) Token))
      (each-in-iterable used-opengl-functions current-function
        (opengl-generate-signature-tokens generated-signatures-tokens
                                          current-function))
      (unless (evaluate-tokens generated-signatures-tokens)
        (return false))
      (return true))

This code would come out much larger in reality in order to handle reading the specification and generating other necessary boilerplate. However, it shows the huge power Cakelisp provides by offering compile-time code parsing and generation, all written in Cakelisp alongside your project's runtime code.

Conclusion

I am not planning on implementing the 100% Cakelisp solution for the time being. It would take a decent amount of time compared to just using Galogen.

However, I hope it gives you another example of why I think Cakelisp is awesome. All of that extra tooling can be removed when you have a programming language with full-power compile-time code generation and execution (like Cakelisp).

[1] You may not like the macro solution here. For example, you might not like how the space between gl and the rest of the function name impairs text search, find-references tooling, etc. You could instead use a compile-time function executed during the post-references-resolved phase to scan every function for any invocations starting with gl and build the required list that way. It would be an O(n) scan over potentially large bodies of code, which is where the macro solution starts to look more appealing.
Several cleanups and features have been added to Cakelisp since the last post:

Precompiled headers for comptime compilation are now merged to master. I put in a lot of work to get them working on Windows as well, though I didn't see the performance improvements I was hoping for on that platform.

tokenize-push was rewritten. This generator is the primary way to populate new Token arrays, and is the foundation of all macros in Cakelisp. For example:
    (defmacro array-size (array-token symbol)
      (tokenize-push output
        (/ (sizeof (token-splice array-token))
           (sizeof (at 0 (token-splice array-token)))))
      (return true))

This example macro uses tokenize-push to add the array-size expression to the output array. It is approximately like the C preprocessor, except that tokenize-push works at the token level rather than the individual-character level.
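For comparison, the classic C-preprocessor version of the same macro works on raw text rather than tokens:

```cpp
// The textual-substitution analog of array-size. The preprocessor pastes
// `a` blindly into the expansion with no knowledge of token structure.
#define ARRAY_SIZE(a) (sizeof(a) / sizeof((a)[0]))

int ArraySizeExample()
{
	int values[5] = {0};
	return (int)ARRAY_SIZE(values); // 5
}
```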

My first implementation of tokenize-push was dreadful, but got the job done at the time. It would generate code to create the tokens which looked like this:
    std::vector<Token>& outputEvalHandle_0 = output;
    if (!tokenizeLinePrintError("(/ (sizeof ",
                                "Dependencies/cakelisp/runtime/Macros.cake", 9,
                                outputEvalHandle_0))
    {
        return false;
    }
    PushBackTokenExpression(outputEvalHandle_0, arrayToken);
    if (!tokenizeLinePrintError(") (sizeof (at 0 ",
                                "Dependencies/cakelisp/runtime/Macros.cake", 9,
                                outputEvalHandle_0))
    {
        return false;
    }
    PushBackTokenExpression(outputEvalHandle_0, arrayToken);
    if (!tokenizeLinePrintError(")))",
                                "Dependencies/cakelisp/runtime/Macros.cake", 10,
                                outputEvalHandle_0))
    {
        return false;
    }
    return true;

As you can see, this calls the tokenizer on a static string passed in as an argument, which means the tokens that were already parsed when the macro definition was originally evaluated must be re-tokenized and allocated.

The rewrite solved all of tokenize-push's previous limitations. The new version reuses the tokens already in memory to output tokens, which gives the following benefits:
• Token sources (line and column numbers) are more accurate
• Macros are unlimited in length (previously, there was a max of 1024 unspliced characters, which I ended up hitting)
• Strings no longer need extra delimiting, which was a source of frustration and bugs
• Performance is better for several reasons: no more tokenization from strings, greatly reduced macro compilation time (fewer symbols in each macro's generated .cpp file)

It works by saving a pointer to the tokens in the ObjectDefinition. This pointer is later retrieved by the macro at runtime via the definition name and tokens CRC. I decided to use a CRC because it was the only stable identifier I could think of. The following were considered and rejected as identifiers:
• File/line number. This causes unnecessary recompilation if the file the macro definition is in is modified
• Incrementing counter. This doesn't work because macros within macros can cause non-sequential incrementation due to unpredictable macro resolve/evaluates
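A content hash like a CRC depends only on the bytes hashed, not on file position or evaluation order, which is what makes it a stable identifier. As an illustration, here is the standard CRC-32; I don't know which exact variant Cakelisp uses, so treat this as a sketch of the idea:

```cpp
#include <cstdint>
#include <cstddef>

// Standard (reflected, polynomial 0xEDB88320) CRC-32 over a byte span. The
// value depends only on the content, so editing unrelated lines in the same
// file doesn't change a definition's identifier.
uint32_t Crc32(const char* data, size_t length)
{
	uint32_t crc = 0xFFFFFFFFu;
	for (size_t i = 0; i < length; ++i)
	{
		crc ^= (uint8_t)data[i];
		for (int bit = 0; bit < 8; ++bit)
			crc = (crc >> 1) ^ (0xEDB88320u & (0u - (crc & 1u)));
	}
	return ~crc;
}
```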

While this system is better in every other way, it does require that all macros be evaluated each time Cakelisp is run so that their tokens are loaded. This may become an issue if incremental compilation is ever introduced, because the macro's tokens would then need to be loaded by some separate system. The old tokenize-push generated source files that were completely self-contained.

Here's what the generated code looks like now:
    std::vector<Token>& outputEvalHandle_0 = output;
    {
        TokenizePushContext spliceContext;
        TokenizePushSpliceTokenExpression(&spliceContext, arrayToken);
        TokenizePushSpliceTokenExpression(&spliceContext, arrayToken);
        if (!TokenizePushExecute(environment, "array-size", 1677447134,
                                 &spliceContext, outputEvalHandle_0))
        {
            return false;
        }
    }
    return true;

TokenizePushExecute() traverses the tokenize-push token list stored under the CRC 1677447134 on the "array-size" definition and pops expressions from the spliceContext based on the interpretation.

Cakelisp can now be executed within another Cakelisp's comptime.

This was a feature I picked up from Jonathan Blow's language, specifically this video, if I recall correctly. Previously, my "RunTests" file, which tests various features of the language, would run Cakelisp in separate subprocesses. This had the drawback that I couldn't easily attach a debugger or run valgrind on individual tests, nor check the overall memory usage of the system.

The Cakelisp environment is already exposed to comptime functions, so I moved some functions around to make it possible to create a new sub-environment and evaluate Cakelisp within it.

File Helper

The project I'm working on next using Cakelisp is a file manager application with an interface based on Emacs' find-file. I focused first on making directory browsing feel fast, then planned the next versions. I also did some research on file-size visualization techniques, which will be File Helper's X-factor.

Unexpected benefits of Cakelisp

I included Cakelisp prominently on my résumé, which drew a lot of interest and questions during a recent job interview. I think it was a great project to show my skills and interest in programming, and it helps my résumé stand out from the rest.
I am happy to be a new part of Handmade Network!

Progress since submission

There have been several things I've been working on since I submitted Cakelisp to the network:

Kitty Gridlock completion
Kitty Gridlock is a game I made in about 3 weeks (off-hours, weekends primarily) for my girlfriend. I had a deadline to deliver (her birthday), so I had a good goal and could limit scope effectively.

I wanted to prove to myself that I could be productive in Cakelisp in its current state, i.e., without making any modifications to the language. I was absolutely successful in this goal: I got the game done faster than expected, without having to make changes to Cakelisp.

Thanks to SDL's portability features, I was able to port Kitty Gridlock to the target platform, Android, in a matter of days. The use of C++ as the output language was a definite boon to this, because it naturally meant support for the Android NDK environment.

I'm no artist, but I was quite pleased with the end look of Kitty Gridlock, which I drew for my girlfriend to maximize cuteness:

It turns out having a target audience of a single person really narrows what styles I needed to consider :).

Cakelisp was an improvement over my previous development environment (C++ with Jam and shell for building) in several ways:
• I used compile-time code execution to extract and bin (convert to binary) the puzzle database when necessary
• Compile-time code execution copied generated source files over to the required Android JNI folder, further easing Android builds and making them more reliably up-to-date
• Generators allowed me to work around Cakelisp's missing features. I've now added many of the generators into C and C++ helper libraries for future projects to leverage
• I was much more satisfied and excited to work on Kitty Gridlock because I enjoyed the environment more. Developer satisfaction is important!

I plan on writing a more detailed blog post on Kitty Gridlock's implementation soon.

The code is here. Kitty Gridlock cloc'ed in at about 1,400 lines of Cakelisp, not including GameLib.

GameLib module reorganization
GameLib is a library which provides a variety of 3rd-party dependencies via Cakelisp modules. Developing Kitty Gridlock revealed several problems in GameLib, many of which were easily fixed.

GameLib was updated to not include any 3rd-party dependencies by default. Instead, each dependency is downloaded to a folder as soon as you import its module. Previously, GameLib included all 3rd-party libraries as git submodules.

This change allows me to continue adding useful tools to GameLib without having it balloon into hour-long submodule downloading. When I create a new project, only the modules I use will be downloaded to that project's directory.

I made this change because my latest project, Kitty Gridlock, did not use Ogre, but still had to spend ~30 minutes downloading and building it for no reason. This new strategy feels almost like an ultra-lightweight package manager: I have easy access to 3rd-party libraries but only pull the ones I need.

At this point I've added the following 3rd party libraries:
• Aubio
• dear ImGui
• SDL 2
• Ogre3D (v. 2, aka ogre-next)
• tracy profiler

GameLib has been an important addition to my workflow because I always felt restrained by having to set up dependencies by hand on every new project. Thanks to Cakelisp's seamless compile-time code execution, I can download, build, and link to 3rd-party dependencies simply by importing the Cakelisp module for that dependency.

The best part to me is there is no magical executable that is handling this operation: it's all easily readable in Cakelisp, and built into GameLib. It feels magical without being too magical, which was important to me.

Future plans

I've made good progress on the precompiled headers feature. It nearly halves the amount of time needed to build all of Cakelisp's tests. I need to refactor the command generation code in order to cleanly port precompiled headers from compile-time to build-time, and I haven't done native Windows MSVC precompiled headers yet.

I want to stay true to my goal of using non-language-related projects to drive Cakelisp's development. I have two project ideas currently vying for my next project slot: a game prototype and a file system productivity application. I like to let things soak a bit before committing to make sure I remain interested in the idea. I'm in that phase right now.

Conclusion

I think the Handmade influence should be clear from reading my goals and implementation of Cakelisp. I'm looking forward to checking out more Handmade projects and getting inspiration from all of you.