I must confess that my gaming days are largely behind me (though I do still dabble occasionally), but I certainly continue to keep tabs on the latest gaming developments. The recent Unreal Engine 5 reveal, which demonstrates the next iteration of that graphics and sound engine running on the forthcoming PlayStation 5 platform, is simply stunning.
You can see how much incredible effort has gone into making everything behave more fluidly and naturally: from the look, function and feel of surfaces to object interaction, light and motion response and all-immersive surround sound. We’re also talking about super-high-poly assets, with 3D objects rendered from billions of tiny triangles.
As teased in the title, this doesn’t just bring next-level realism and fluidity to games, but also to film and TV special effects, and even whole televisual and cinematic sequences. I foresee a time when actors simply record a set number of movements, gestures, vocalisations and idiosyncratic behaviours, and the AI processes it all into fully featured film sequences, with full dialogue and interactions. No need for weekly re-shoots or casts of hundreds attending laborious sessions; instead you simply set the engine to remodel it all in accordance with the new storyboard. That’s something of a futuristic extrapolation, but those days aren’t that far ahead.
The level of special effects and immersion in the best cinematic releases is already stellar, but those frame-by-frame methodologies are time-consuming, laborious and expensive. With increased AI and automation, and much-improved smarts in these gaming engines, they can take on a huge range of tasks. I can see very high-resolution animations being produced directly and entirely in what is essentially gaming software, and certain trickier action and environmental scenes for film and television being reproduced entirely in this digital medium.
The Unreal Engine uses a number of smart component systems: the Chaos physics and destruction system, Convolution Reverb audio, Lumen dynamic global illumination, Nanite virtualised micropolygon geometry and the Niagara visual effects (particle) system. The realm of real-time dynamic graphics is ascending to the next level, and I’m very excited by the potential here. Developers always take the technology further, typically going way beyond what was intended, so it’s thrilling to see what will be created with this new system.
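To give a flavour of how such components fit together, here is a minimal C++ sketch of a real-time engine ticking several subsystems (physics, lighting, audio, effects) in lockstep each frame. All of the names here are invented for illustration; this is not Unreal Engine’s actual API, just the general shape of the idea.

```cpp
#include <memory>
#include <vector>

// Hypothetical sketch only -- NOT Unreal Engine's real API.
// Each subsystem advances by the same time slice every frame, which is
// what keeps physics, light, sound and particles visually in sync.
struct Subsystem {
    virtual ~Subsystem() = default;
    virtual void Tick(double deltaSeconds) = 0;  // advance one frame
    int ticks = 0;                               // frames processed so far
};

struct PhysicsSystem : Subsystem {   // stand-in for a Chaos-style physics step
    void Tick(double) override { ++ticks; }
};
struct LightingSystem : Subsystem {  // stand-in for dynamic global illumination
    void Tick(double) override { ++ticks; }
};
struct AudioSystem : Subsystem {     // stand-in for convolution-reverb audio
    void Tick(double) override { ++ticks; }
};
struct VfxSystem : Subsystem {       // stand-in for a Niagara-style VFX pass
    void Tick(double) override { ++ticks; }
};

struct Engine {
    std::vector<std::unique_ptr<Subsystem>> systems;
    int framesRendered = 0;

    // One frame: every registered subsystem ticks with the same delta time.
    void Frame(double deltaSeconds) {
        for (auto& s : systems) s->Tick(deltaSeconds);
        ++framesRendered;
    }
};
```

The design point is simply that all of these “smart components” share one frame clock, so a physics impact, the light it disturbs and the sound it makes all land on the same rendered frame.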
This will very likely wholly change the face of immersive entertainment, and be reflected in everyday interactions on mobile phones, Netflix and the like. The possibilities here are limitless! The PlayStation 5 was due for release at the end of the year, though that may yet shift to 2021; we’ll have to wait and see. In any case, we’ll need to wait until partway through 2021 for the release of Unreal Engine 5 itself.
Here follow some of the key terms / elements / components employed:
And here is the exceptional demo, running on the forthcoming PlayStation 5 as mentioned:
© Affino 2024