The opportunities for people with the skillset to build video games and real-time software are exploding right now. Multiple new, rapidly growing fields such as virtual production, real-time VFX, virtual and augmented reality (VR & AR), gamification for training and marketing, synthetic training data for AI and more are looking for skilled candidates.
We've provided a guide to emerging fields where game development and real-time software development skills are sought after, along with notes on how different disciplines contribute to each industry.
Virtual production involves using real-time 3D software such as Unity and Unreal Engine to render special effects for film and TV instantly, rather than waiting until filming and editing have been completed. It is often combined with techniques such as motion capture, green screens or wraparound LED panels to give an immediate sense of how an actor's performance will look once the special effects are applied.
'Virtual Preproduction' refers to using these techniques to test and plan how scenes will be filmed, after which the scenes are recorded and a second, final SFX pass is applied in post-production. 'Virtual Production' or 'Real-Time Production' refers to using the output of the real-time compositing step as the final result.
Custom Virtual Reality application development is overwhelmingly accomplished using game engine middleware. While there are a number of no-code, low-code or purpose-built toolsets for VR and AR, such as Amazon Sumerian and Zappar's ZapWorks, the vast majority of projects (and jobs) are built with general-purpose game engine stacks such as Unity, Unreal or Godot.
VR and AR apps require much the same skills as traditional games, and there is significant crossover with mobile skillsets. Artists may be called upon either to do highly detailed work targeting powerful desktop VR systems or significantly reduced, low-poly work for augmented reality platforms, including phones and all-in-one headsets such as the Microsoft HoloLens.
Many digital studios offering work-for-hire VR & AR development services have more work for producers, business developers and account executives than game development firms, which might subsist on a single large publishing deal for many years.
Games and interactive experiences have been used to teach for many years. Educators are taking advantage of advances in rendering quality and game engine capabilities to deliver curriculum, build specialised skills and expose students to environments that are dangerous or difficult to recreate. Mobile and web platforms offer a convenient and fun way to learn languages, study for school or university, or pick up vocational skills like programming.
Educational games content takes all of the familiar games industry skills (programming, design, art, audio, narrative, etc.) to put together, and teams frequently also need education specialists and domain experts. Educational games vary widely in terms of budget and scope. Given the cloud-connected nature of many startups and educational institutions, frontend, backend and full-stack web application development skills are often also in demand on these teams.
Did you know that many AI tools are being trained and validated using images rendered from within game engines? Building 'synthetic environments' in Unity and Unreal Engine has become a popular and cost-effective way to produce huge image datasets for training and testing computer vision models. Artists, designers and engineers work to build digital scenes with a carefully calibrated level of visual fidelity, which can simultaneously provide 'ground truth' information about where objects of interest sit within the scene, as well as synthetic images of those objects from any camera angle.
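As a rough illustration of how ground-truth labels fall out of a synthetic scene, the sketch below projects the corners of a known 3D object into image space using a simple pinhole camera model. The object placement, focal length and image size are all hypothetical; in a real pipeline the engine renders the image and supplies the camera parameters.

```python
# Minimal sketch: deriving a 2D ground-truth label from a known 3D scene.
# The engine (Unity, Unreal) would render the image; here we only show the
# labelling maths with a simple pinhole camera model.
import numpy as np

def project_points(points_3d, camera_pos, focal_px, image_size):
    """Project world-space points into pixel coordinates.

    Assumes a camera at `camera_pos` looking down the +Z axis with no
    rotation, purely for illustration.
    """
    w, h = image_size
    rel = points_3d - camera_pos                # camera-space coordinates
    x = focal_px * rel[:, 0] / rel[:, 2] + w / 2
    y = focal_px * rel[:, 1] / rel[:, 2] + h / 2
    return np.stack([x, y], axis=1)

# Corners of a 1m cube placed 5m in front of the camera (the object of interest).
corners = np.array([[x, y, 5.0 + z]
                    for x in (-0.5, 0.5) for y in (-0.5, 0.5) for z in (-0.5, 0.5)])

pixels = project_points(corners, camera_pos=np.zeros(3),
                        focal_px=800, image_size=(1280, 720))

# Ground-truth 2D bounding box: the extent of the projected corners.
x_min, y_min = pixels.min(axis=0)
x_max, y_max = pixels.max(axis=0)
print(f"bbox: ({x_min:.0f}, {y_min:.0f}) -> ({x_max:.0f}, {y_max:.0f})")
```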
A number of industries have begun investing in game-like pipelines for AI training, the best known being the development of driverless cars. Consumer retail, aerospace, manufacturing, healthcare and robotics are some of the industry verticals that contain early adopters of this technique.
Digital simulation allows people to make important predictions about the world, and to weigh different choices and plans against each other. For example, a real-time simulation of a warehouse might show how quickly a certain amount of goods could be accessed or loaded depending on how the shelves are arranged.
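As a toy version of that warehouse example, the sketch below compares the average time to pick a random item under two shelf arrangements. The layouts, walking speed and dock position are entirely made up for illustration.

```python
# Minimal sketch (hypothetical layouts): compare average pick time for two
# shelf arrangements by walking from a loading dock to each item and back.
import random

def average_pick_time(shelf_positions, picks=1000, walk_speed=1.4, seed=0):
    """Average round-trip time (seconds) from the dock at (0, 0) to a random shelf."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(picks):
        x, y = rng.choice(shelf_positions)
        distance = 2 * (abs(x) + abs(y))        # Manhattan walk there and back
        total += distance / walk_speed
    return total / picks

# Layout A: shelves clustered in a far corner; Layout B: shelves near the dock.
layout_a = [(20 + i, 15 + j) for i in range(5) for j in range(5)]
layout_b = [(2 + i, 2 + j) for i in range(5) for j in range(5)]

print(f"Layout A: {average_pick_time(layout_a):.1f}s per pick")
print(f"Layout B: {average_pick_time(layout_b):.1f}s per pick")
```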
The concept of 'Digital Twins' builds on this idea with a feedback loop where relevant sensor data from a real item is fed back into a digital simulation of that same item, which in turn updates the simulation model. This allows the digital simulation to predict how productive the item is, how worn out it is likely to be, and how long it can continue working before attention is needed to repair or maintain it. Predictive analytics and maintenance have become important tools for keeping critical equipment running smoothly.
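A minimal sketch of that feedback loop, assuming a made-up wear model for a single bearing: each sensor reading updates the twin's estimated wear rate, and the twin then projects how many operating hours remain before maintenance is needed.

```python
# Minimal sketch of a digital-twin feedback loop (hypothetical wear model):
# sensor readings from the real machine update the twin's state, and the twin
# predicts how many operating hours remain before maintenance is required.
from dataclasses import dataclass

@dataclass
class BearingTwin:
    wear: float = 0.0             # accumulated wear, 0.0 (new) to 1.0 (worn out)
    wear_per_hour: float = 0.001  # current best estimate of the wear rate

    def ingest(self, hours_run: float, measured_vibration: float) -> None:
        """Fold a sensor reading into the model: higher vibration implies faster wear."""
        estimated_rate = 0.001 * (1.0 + measured_vibration)
        # Blend the new estimate with the old one to smooth out noisy sensors.
        self.wear_per_hour = 0.8 * self.wear_per_hour + 0.2 * estimated_rate
        self.wear += self.wear_per_hour * hours_run

    def hours_until_maintenance(self, threshold: float = 0.8) -> float:
        """Predict remaining operating hours before the wear threshold is crossed."""
        remaining = max(threshold - self.wear, 0.0)
        return remaining / self.wear_per_hour

twin = BearingTwin()
for vibration in (0.2, 0.3, 0.9, 1.1):   # readings streamed from the real bearing
    twin.ingest(hours_run=100, measured_vibration=vibration)
print(f"~{twin.hours_until_maintenance():.0f} hours until maintenance is recommended")
```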
There are many kinds of digital simulations. Some are real-time and work in a very similar manner to game engines; others are 'discrete', advancing only when significant events occur rather than continuously computing the running state of a simulated world. Some digital twins and simulations have rich 3D worlds, drawing on the skills of artists and designers, while others simply output their results as text and forego a visual representation altogether.
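To make the distinction concrete, here is a minimal discrete-event sketch (the loading-bay scenario is hypothetical): rather than ticking every frame like a game engine, the simulation jumps directly from one scheduled event to the next.

```python
# Minimal sketch of a discrete-event simulation: instead of ticking every
# frame, time jumps straight to the next scheduled event in a priority queue.
import heapq

def simulate_loading_bay(arrival_times, service_time=30.0):
    """Return the time each truck finishes unloading at a one-truck bay."""
    events = [(t, i) for i, t in enumerate(arrival_times)]  # (time, truck id)
    heapq.heapify(events)
    bay_free_at = 0.0
    finish_times = {}
    while events:
        time, truck = heapq.heappop(events)     # jump to the next event
        start = max(time, bay_free_at)          # wait if the bay is busy
        bay_free_at = start + service_time
        finish_times[truck] = bay_free_at
    return finish_times

print(simulate_loading_bay([0.0, 10.0, 95.0]))
# {0: 30.0, 1: 60.0, 2: 125.0}  (trucks queue when arrivals overlap)
```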