On Game of Thrones (Season 8), I designed and produced virtual production tools, mainly utilising VR, for previsualisation and shot-planning of the more complex scenes. The primary tool was Pathfinder, which allows you to both scout locations and plan camera shots.
Epic Games has created an excellent write-up of our work, which you can read here:
The Mandalorian TV series used substantial amounts of virtual production in its development. My role was to work as a Pipeline TD, developing tools and features for Unreal Engine and its use in the show's virtual production.
In particular, I focused on C++ support and expansion of the Python API to integrate core Maya tools into the new Unreal Engine workflow.
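For a flavour of the kind of glue involved, here is a minimal sketch of importing a Maya FBX export through Unreal's Python API; the file path and destination folder are hypothetical placeholders, not the production pipeline itself:

```python
# Minimal sketch: ingesting a Maya FBX export via Unreal's Python API.
# Paths are hypothetical placeholders; run inside the Unreal editor.
import unreal

def import_maya_fbx(fbx_path, destination="/Game/MayaImports"):
    """Import an FBX exported from Maya into the Unreal content browser."""
    task = unreal.AssetImportTask()
    task.filename = fbx_path           # source FBX on disk
    task.destination_path = destination
    task.automated = True              # suppress interactive import dialogs
    task.save = True                   # save imported assets immediately
    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
    return task.imported_object_paths

print(import_maya_fbx("D:/exports/prop_chair.fbx"))
```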
On Avengers 2 I was the P-Cap operator for a capture stage at Shepperton Studios, where I captured the principal cast and stunt work.
Mowgli made heavy use of Unreal Engine for both previsualisation and realtime display of performance capture. I created digital sets, mapped digital avatars to performers, and produced rushes of the shoots. This work took place both in studio and on location.
On Kingsglaive I alternated between working as a performance capture operator and as the realtime operator using MotionBuilder.
For Star Wars I worked in studio and on location as a motion capture technician, and also on location as an IBC capture operator.
For The Ritual I used Unreal Engine to build previs scenes to help plan the capture of more difficult shots. These were then used with performance capture digital sets to develop character motion for the monster's digital avatar, all within Unreal.
For Apes 2, my work centred on the various performance capture pick-ups for Andy Serkis.
I also worked on set, alongside the Ncam realtime team, to bring motion capture and Ncam together.
I created a system for placing background NPCs into any project, including mocap import and retargeting, with seeded randomised characters, outfits, and animations, focused on bringing digital sets to life. This system has since been reused in other projects and easily adapted to new use cases; a minimal sketch of the seeded randomisation idea follows below.
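An engine-agnostic sketch of that seeding idea; the character, outfit, and animation names here are hypothetical illustrations, not the project's assets:

```python
# Engine-agnostic sketch of seeded background-NPC randomisation.
# Seeding makes each crowd layout reproducible between sessions.
import random

CHARACTERS = ["villager_a", "villager_b", "guard"]      # hypothetical assets
OUTFITS    = ["commoner", "merchant", "worn_leather"]
ANIMATIONS = ["idle_loop", "chatter", "sweep_floor"]

def seed_background_npcs(seed, count):
    """Return a reproducible list of (character, outfit, animation) picks."""
    rng = random.Random(seed)   # local RNG so global random state is untouched
    return [
        (rng.choice(CHARACTERS), rng.choice(OUTFITS), rng.choice(ANIMATIONS))
        for _ in range(count)
    ]

# The same seed always yields the same crowd, keeping review sessions consistent.
for npc in seed_background_npcs(seed=42, count=5):
    print(npc)
```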
I created a bespoke plugin to ingest data from specialised sports software (Hawk-Eye) into Unreal Engine, and to interact and render correctly and flexibly within the Disguise server workflow. The system was built for absolute reliability, adapting as needed for use in live broadcast.
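As a rough illustration of the ingest side only, here is a generic UDP listener sketch; the packet layout and port are hypothetical stand-ins, not Hawk-Eye's actual wire format or the plugin's real code:

```python
# Generic UDP ingest sketch for positional tracking data.
# The three-float packet layout and port below are hypothetical.
import socket
import struct

def listen_for_tracking(port=9999):
    """Yield (x, y, z) positions decoded from incoming UDP packets."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        # Assume three little-endian floats per packet: x, y, z in metres.
        yield struct.unpack("<3f", data[:12])

for position in listen_for_tracking():
    print("tracked position:", position)
```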
Star Citizen's Squadron 42 was a very large project for me, taking place over two years. My role was to drive and manage the realtime visualisation. Since this was (at the time) a CryEngine game, all the realtime work was performed in CryEngine, in the actual game client. This allowed our realtime output to closely reflect the end result.
On The Last Frontier I created the initial pitchvis in Unreal Engine, which was a complete scene with lighting, motion capture, and characters. Later in development I also helped with marketing material, in particular R&D into new delivery mediums such as 360 and VR.
Performance Capture, data cleanup and pipeline development
As my first commercial project, I worked on stage to capture data, performed the post cleanup, and developed new scripts to aid future data prep and cleanup.
As it was my first project, I was also just getting up to speed with all the techniques and pipelines.
Mad Factory VR is a fun VR game built around arcade-style gameplay. I acted as an Unreal Engine consultant for the project and assisted the immersive team with the core game logic and VR best practices.
The Tempest was a massive project for me, where performance capture was merged with traditional stage show control to produce a realtime avatar, driven live on stage and controlled through external tools over network protocols, mainly DMX, OSC, and PSN.
I created the Unreal project and merged all the elements together, including all the state logic, visual changes, materials, and particle FX; a minimal sketch of receiving OSC cues is shown below.
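To give a feel for how external cues arrive over OSC, here is a minimal listener sketch using the python-osc package; the /avatar/state address and port are hypothetical, not the show's actual cue map:

```python
# Minimal OSC cue listener using python-osc (pip install python-osc).
# The cue address and port are hypothetical.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_avatar_state(address, *args):
    """Fires whenever the show-control desk sends a state cue."""
    print(f"{address} -> {args}")  # e.g. trigger a material or particle change

dispatcher = Dispatcher()
dispatcher.map("/avatar/state", on_avatar_state)  # hypothetical cue address

server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
server.serve_forever()  # block and handle incoming cues
```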
Performance Capture
Role: Lead Developer
Dream is an online production performed live at Portsmouth Guildhall in March 2021. The production combined the latest gaming and theatre technology to create a shared experience between audiences and actors, and featured an interactive symphonic soundtrack.
Role: Lead Developer
Dream AR was a Magic Leap-powered project that explored a small part of Puck's journey through the forest. Making use of augmented reality to infuse the magic of the forest into the player's space, it lets the user guide Puck through the forest with simple interactions.
A short-duration, single-player experience for use as LBE (location-based entertainment).