Film and TV

Game of Thrones

On Game of Thrones (Season 8), I designed and produced virtual production tools, mainly utilising VR, for previsualisation and shot-planning of the more complex scenes. The primary tool was Pathfinder, which allows you both to scout locations and to plan camera shots.

Epic Games has created an excellent write-up of our work, which you can read here:
Game of Thrones Previs using Unreal Engine

The Mandalorian (Season 2)

The Mandalorian TV series used substantial amounts of virtual production in its development. My role was to work as a Pipeline TD, developing tools and features for Unreal Engine and its use in the show's virtual production.

In particular, I focused on C++ support and on expanding the Python API to integrate core Maya tools into the new Unreal Engine workflow.
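As an illustrative sketch only (not the actual production tools), this is the kind of editor scripting the Unreal Python API enables; the /Game/Characters folder and the idea of exporting a skeletal mesh list for a Maya-side tool are hypothetical examples:

```python
# Minimal sketch, assuming the Unreal Editor's built-in Python API:
# list SkeletalMesh assets so an external (e.g. Maya-side) tool could
# compare them against its own scene contents.
import unreal

def list_skeletal_meshes(root_path="/Game/Characters"):
    """Return the asset paths of all SkeletalMesh assets under root_path."""
    mesh_paths = []
    for path in unreal.EditorAssetLibrary.list_assets(root_path, recursive=True):
        # Loading each asset is heavier than querying AssetData, but keeps the
        # sketch version-agnostic across engine releases.
        asset = unreal.EditorAssetLibrary.load_asset(path)
        if isinstance(asset, unreal.SkeletalMesh):
            mesh_paths.append(path)
    return mesh_paths

if __name__ == "__main__":
    # Run from the Unreal Editor's Python console or a startup script.
    for mesh_path in list_skeletal_meshes():
        unreal.log(mesh_path)
```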

Avengers 2

On Avengers 2 I was the P-Cap operator for a capture stage at Shepperton Studios, where I captured principal cast and stunt work.

Mowgli

Mowgli made heavy use of Unreal Engine for both previsualisation and realtime display of performance capture. I created digital sets, mapped digital avatars to performers, and produced rushes of the shoots. This was performed both in studio and on location.

Final Fantasy - Kingsglaive

On Kingsglaive I alternated between working as a performance capture operator and as the realtime operator using MotionBuilder.

Star Wars - Episode 7

For Star Wars I worked in studio and on location as a motion capture technician, and I also worked on location as an IBC capture operator.

The Ritual

For The Ritual I used Unreal Engine to build previs scenes to help plan the capture of the more difficult shots. These were then used with performance capture digital sets to develop character motion for the digital avatar of the monster (all in Unreal).

Dawn Of The Planet Of The Apes

For Apes 2, my work centred on the various performance capture pick-ups for Andy Serkis.

Jingle Jangle

On-set Ncam realtime operator

Video Games

Star Citizen - Squadron 42

Star Citizen's Squadron 42 was a very large project for me, taking place over two years. My role was to drive and manage the realtime visualisation. Since this was (at the time) a CryEngine game, the realtime was all performed in CryEngine, in the actual game client. This allowed our realtime to closely reflect the end result.

Planet of the Apes - Last Frontier

On Last Frontier I created the initial pitchvis in Unreal Engine, which was a complete scene with lighting, motion capture, and characters. Later in development I also helped with marketing material, in particular R&D into new delivery media, such as 360 video and VR.

Battlefield 1

Performance Capture, data cleanup and pipeline development

Ryse - Son of Rome

As my first commercial project, I worked on stage to capture data, performed the post cleanup, and also developed new scripts to aid in future data prep and cleanup.

Being my first project, it was also where I got up to speed with all the techniques and pipelines.

Mad Factory VR

Mad Factory VR is a fantastic VR game and my first shipped VR title. I assisted the immersive team with the game logic and implementation.

Other

...
The Tempest

The Tempest was a massive project for me, where performance capture was merged with traditional stage show control to produce a realtime avatar, driven live on stage and controlled by external tools over network protocols, mainly DMX, OSC, and PSN.

I created the Unreal project and merged all the elements together, including all the state logic, visual changes, materials, and particle FX.
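As a hedged illustration of the show-control side (not the actual Tempest code), the sketch below uses the python-osc package to listen for OSC messages of the kind a control desk might send and hand them off to whatever drives the avatar state; the /avatar/state address, port, and forwarding stub are hypothetical examples:

```python
# Minimal sketch, assuming the python-osc package: receive OSC messages from
# show control and forward them to the engine-side state logic.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_avatar_state(address, *args):
    """Handle an incoming OSC message, e.g. /avatar/state "storm" 0.75."""
    state_name, intensity = args[0], float(args[1])
    print(f"{address}: switching avatar to '{state_name}' at intensity {intensity}")
    # In production this would be forwarded to the engine (e.g. via a socket or
    # a plugin's message queue) to trigger the matching state logic and FX.

dispatcher = Dispatcher()
dispatcher.map("/avatar/state", on_avatar_state)

# Listen on all interfaces, port 9000 (a hypothetical port for this sketch).
server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
server.serve_forever()
```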

...
Coldplay - Adventure of a Lifetime

Performance Capture

...
Dream

Dream Website

Role: Lead Developer

Dream is an online production performed live at Portsmouth Guildhall in March 2021. The production combined the latest gaming and theatre technology to create a shared experience between audiences and actors, and featured an interactive symphonic soundtrack.

...
Dream AR

Role: Lead Developer

Dream AR was a Magic Leap powered project that explored a small part of Puck's journey through the forest. It makes use of Augmented Reality to infuse the magic of the forest into the player's space, with simple interactions used to guide Puck through the forest.

A short-duration, single-player experience for use as an LBE.

...
Live Broadcast Event. Ongoing (More info coming)

Integration of an external software's data into Unreal Engine, and its integration into a Disguise server workflow, with dynamic plate switching.