I've been programming in various languages since childhood. Here you'll find some of the more interesting things I've made, with a heavy bias towards the most recent. I've always loved learning new technologies and sharing that knowledge and excitement. As you can see, computer graphics, interactivity, and applied mathematics have long been keen interests of mine. I've also come to deeply value architecting code and systems that follow sensible best practices and make wise decisions about design patterns.
If you're looking for someone to make interesting things with you, I'm available for hire, either on a contractual basis or as an employee. I'm currently based in the East of England, am open to relocating for the right opportunity, and am always ready to work remotely. The best way to contact me is by email at
Building high-quality interactions in VR presents a challenge that I continue to enjoy engaging with. A particularly satisfying solution was constructing and solving some differential equations to find the least noticeable adjustment to a player's throw that would bring the object to a target. The result is almost uncanny, with the throw's trajectory feeling natural but still perfectly hitting the target.
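To give a flavour of the idea, here's a minimal sketch of one way such a throw assist can work. This is illustrative only, not the actual solution described above: it keeps the unassisted throw's closest-approach time fixed and solves for the one velocity that hits the target at that time, which is then the smallest possible correction for that flight time. All function and variable names are made up for this example.

```python
import math

# Gravity vector; simple 3D ballistic motion p(t) = p0 + v0*t + 0.5*g*t^2.
G = (0.0, -9.81, 0.0)

def position_at(p0, v0, t):
    """Ballistic position at time t."""
    return tuple(p + v * t + 0.5 * g * t * t for p, v, g in zip(p0, v0, G))

def closest_approach_time(p0, v0, target, t_max=5.0, steps=500):
    """Time at which the unassisted throw passes nearest the target
    (crude sampling here; a closed-form or Newton solve also works)."""
    def dist2(t):
        p = position_at(p0, v0, t)
        return sum((a - b) ** 2 for a, b in zip(p, target))
    times = [t_max * i / steps for i in range(1, steps + 1)]
    return min(times, key=dist2)

def assisted_velocity(p0, v0, target):
    """Velocity that exactly hits the target at the throw's original
    closest-approach time. For a fixed flight time t, the required
    velocity (target - p0 - 0.5*g*t^2)/t is unique, so holding t fixed
    makes this the minimal correction for that arrival time."""
    t = closest_approach_time(p0, v0, target)
    return tuple((tp - p - 0.5 * g * t * t) / t
                 for tp, p, g in zip(target, p0, G))
```

Because the corrected trajectory still starts from the player's hand with nearly the original velocity, the adjustment tends to be hard to notice.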
I've also enjoyed firming up my knowledge of the inner workings of rigid body physics engines. A highlight has been using inertia tensors to predict the rotation that will be caused by an off-centre force, allowing for precise interactions rather than relying on the more common solutions of the "usually correct, eventually" variety.
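The underlying relationship is the standard one from rigid body dynamics: an off-centre force produces a torque τ = r × F, and the resulting angular acceleration is α = I⁻¹τ. A small sketch of that calculation (illustrative names, diagonal body-space tensor for simplicity, not code from any real engine):

```python
def cross(a, b):
    """Cross product of two 3D vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def box_inertia(mass, w, h, d):
    """Diagonal body-space inertia tensor of a solid box of size w*h*d."""
    return (mass * (h * h + d * d) / 12.0,
            mass * (w * w + d * d) / 12.0,
            mass * (w * w + h * h) / 12.0)

def angular_acceleration(inertia_diag, r, force):
    """alpha = I^-1 * tau, with tau = r x F. `r` is the force's point of
    application relative to the centre of mass, in the same frame as
    the (diagonal) inertia tensor."""
    tau = cross(r, force)
    return tuple(t / i for t, i in zip(tau, inertia_diag))
```

With this you can solve the inverse problem too: pick the application point or force that produces exactly the spin you want, rather than iterating until it looks right.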
While working on shaders that simulate reflection and refraction through simple volumes (as I wrote up in an interactive tutorial about tracing rays through a sphere) I realised that just as the surrounding scene can be represented in a cubemap, the shape of the object itself could be too.
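The ray-sphere test at the heart of that kind of shader looks like the following, sketched here in Python rather than shader code. It solves |o + t·d − c|² = r² for the distance t along the ray:

```python
import math

def ray_sphere(origin, direction, centre, radius):
    """Return the two intersection distances (t_near, t_far) along a
    normalised ray direction, or None if the ray misses the sphere.
    Uses the half-b form of the quadratic for fewer operations."""
    oc = tuple(o - c for o, c in zip(origin, centre))
    b = sum(o * d for o, d in zip(oc, direction))   # half of the quadratic's b
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return None                                  # ray misses the sphere
    root = math.sqrt(disc)
    return (-b - root, -b + root)
```

The far intersection (`t_far`) is the useful one for refraction through a solid sphere: it tells you where the refracted ray exits the volume.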
By building a tool to generate the cubemap representation within Unity, it was easy to incorporate into a standard asset pipeline. Pleased with how well it worked, I extended the concept by also representing the internal volume of a hollow container. I could then partially fill that interior with liquid and give it a very simple physics-based animation. For a bit of extra fun I added a good enough approximation of the meniscus effect on the water's surface with some
The end result achieves good performance, maintaining over 100 FPS on a mid-range desktop computer even with supersampling active. The whole thing is especially effective in VR with the combination of stereo vision and ease of manipulating the container.
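One plausible shape for the "very simple physics based animation" mentioned above is a damped spring on the liquid surface's tilt, driven by the container's acceleration. This sketch is a guess at that idea rather than the project's actual code; every parameter name is invented:

```python
def step_tilt(tilt, tilt_vel, container_accel, dt,
              stiffness=40.0, damping=4.0):
    """One semi-implicit Euler step of a damped spring: the liquid
    surface tilts away from the container's lateral acceleration, then
    oscillates and settles back to level. Returns (tilt, tilt_vel)."""
    target = -container_accel * 0.05       # small tilt per unit of acceleration
    accel = stiffness * (target - tilt) - damping * tilt_vel
    tilt_vel += accel * dt                 # integrate velocity first...
    tilt += tilt_vel * dt                  # ...then position (semi-implicit)
    return tilt, tilt_vel
```

A couple of scalar parameters like these are usually enough to make a liquid surface feel alive without simulating any actual fluid.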
I created a simulation of a sandbox-like landscape represented by layered heightmaps. It includes water and sand flow, water being absorbed into sand, hydraulic erosion and transport of sand and rock material, as well as various user controlled tools to interact with the system. I was especially keen to make the tools feel like they are part of the world rather than abstract "brushes", increasing the tactile sense of presence when interacting in VR.
A carefully orchestrated pile of compute shaders and management of data flow to and from the graphics card (basically keeping all state in GPU memory) gives good performance even at larger landscape sizes and with the additional overheads of targeting VR.
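To illustrate the kind of update those compute shaders perform, here is a CPU sketch of a mass-conserving water-flow step over a heightmap. The real version runs per-texel on the GPU with state kept in textures; this is only the same idea in plain Python on a tiny grid, with made-up names:

```python
def flow_step(terrain, water, rate=0.25):
    """One water-flow step over matching 2D grids (lists of lists of
    heights). Each cell pushes water toward lower neighbours in
    proportion to the difference in total surface height. Changes are
    accumulated in a delta buffer so the update order doesn't matter."""
    h, w = len(terrain), len(terrain[0])
    delta = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = terrain[y][x] + water[y][x]
            for dy, dx in ((0, 1), (0, -1), (1, 0), (-1, 0)):
                ny, nx = y + dy, x + dx
                if not (0 <= ny < h and 0 <= nx < w):
                    continue
                diff = total - (terrain[ny][nx] + water[ny][nx])
                if diff > 0.0:
                    # cap per-neighbour outflow at a quarter of the cell's
                    # water so the cell can never go negative
                    move = min(rate * diff, water[y][x] / 4.0)
                    delta[y][x] -= move
                    delta[ny][nx] += move
    return [[water[y][x] + delta[y][x] for x in range(w)] for y in range(h)]
```

The same read-old-state, write-deltas pattern is what makes the update safe to run in parallel across all cells at once on the GPU.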
Finding that Unity's Universal Render Pipeline doesn't yet have built-in support for decals, I set about creating an open source tool for static decal generation. A priority was to make the tool as easy to use as possible. Alongside extensive documentation, the in-editor displays are fully integrated into Unity and designed to detect common problems and suggest solutions. The shaders are built with Unity's Shader Graph system to empower users to customise them.
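The core of projected decals can be sketched quite compactly. This example shows the general technique rather than the tool's actual code: transform a surface point into the decal's oriented local box, and if it lands inside, derive the texture coordinates from the local x/y position. All names here are illustrative:

```python
def decal_uv(point, decal_origin, right, up, forward, size):
    """Return (u, v) in [0, 1] if `point` falls inside the decal's
    oriented box, else None. `right`/`up`/`forward` are the decal's
    orthonormal axes and `size` is its (width, height, depth)."""
    rel = tuple(p - o for p, o in zip(point, decal_origin))
    # project onto each axis and normalise by the box extents, giving
    # local coordinates in [-0.5, 0.5] for points inside the box
    local = tuple(sum(a * r for a, r in zip(axis, rel)) / s
                  for axis, s in zip((right, up, forward), size))
    if all(-0.5 <= c <= 0.5 for c in local):
        return (local[0] + 0.5, local[1] + 0.5)
    return None
```

A static generator bakes this projection into a new mesh ahead of time, so at runtime the decal costs no more than any other piece of geometry.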
Some of the early adopters approached the tool looking to create highly dynamic decals. This led to writing an interactive article exploring some of the different ways to render decals and their suitability for certain tasks.
I've provided contributions and code reviews for the open source tool WoWAnalyzer, a web app that performs in-depth analysis of combat for the online game World of Warcraft. The app provides fully automated analysis of often complex encounters, turning raw data into actionable and well presented advice for the player.
Back when Flash was a going concern I developed a renderer for the low-level Stage3D API. It didn't do anything novel, but I learnt a lot from getting my hands dirty implementing lighting, shadowing, batching, and so on from scratch. The shaders were written in AGAL—a shader assembly language—which was an interesting learning experience in itself, although I wouldn't recommend it for rapid prototyping (here's a heavily commented particle shader from a tutorial I wrote at the time).
I solo-developed a number of projects in Flash, covering the complete spectrum of human experience: shooty-bang games, tricks of perspective, and oh-so-earnest game jam entries. None are particularly great works of art, but they taught me a great deal about building and releasing complete working projects.