January 7, 2020

The Rise of Real Time Rendering


Traditionally, 3D rendering was something that always took a long time. You'd build your scene, set your render running and find something else to do for the next few hours (and pray your machine didn't crash before it finished). Movie studios would link large banks of computers together into render farms to get everything rendered on time. Needless to say, this made 3D graphics expensive to produce, and it was also difficult to predict exactly what you were going to get until the render finished.


The reason it took so long is that to render a 3D scene this way, you need to fire a virtual ray of light for every pixel you intend to render and bounce it around the scene. This is called 'ray tracing', and if you wanted nice-looking images, it was what you had to use.
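To make the "one ray per pixel" idea concrete, here is a toy sketch (not from the original article) that fires a ray through each pixel of a tiny image at a single hypothetical sphere. A real ray tracer would also bounce rays for shadows, reflections and so on, which is where the hours of render time go.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    # direction is assumed normalised, so the quadratic's 'a' term is 1.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width, height):
    """Fire one ray per pixel at a sphere; '#' marks a hit, '.' a miss."""
    rows = []
    for y in range(height):
        row = ""
        for x in range(width):
            # Map the pixel onto an image plane at z = -1 in front of the camera.
            u = (x + 0.5) / width * 2 - 1
            v = (y + 0.5) / height * 2 - 1
            length = math.sqrt(u * u + v * v + 1)
            direction = (u / length, v / length, -1 / length)
            hit = ray_sphere_hit((0, 0, 0), direction, (0, 0, -3), 1.0)
            row += "#" if hit is not None else "."
        rows.append(row)
    return rows
```

Even this miss-everything-but-one-sphere version does a ray-geometry intersection test per pixel; scale that to millions of pixels, thousands of objects and many bounces per ray, and the traditional render times make sense.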


In parallel to this, games and other real-time applications render their scenes using a different approach called 'rasterization'. Instead of accurately modelling rays of light, rasterization uses a collection of approximations that people have discovered over the years to fake lighting as convincingly as possible in 1/60th of a second.
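The classic example of such an approximation is Lambertian diffuse shading: instead of tracing any light rays, the brightness of a surface point is estimated purely from the angle between its normal and the light direction. A minimal illustrative sketch (my example, not the article's):

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert(normal, light_dir, light_intensity=1.0):
    """Diffuse approximation: brightness = intensity * max(0, N dot L).

    No rays are bounced around the scene; the surface orientation alone
    fakes how much light the point receives, which is cheap enough to
    evaluate for every pixel at 60 fps."""
    n = normalize(normal)
    l = normalize(light_dir)
    return light_intensity * max(0.0, sum(a * b for a, b in zip(n, l)))
```

A surface facing the light directly gets full brightness; one facing away gets none. Shadows, reflections and bounce lighting all need further tricks layered on top, which is exactly why rasterized lighting historically lagged behind ray tracing in realism.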


Over the last few years, the gap between ray tracing and rasterization has closed to the point where it can be hard to tell the difference between the two. Additionally, the Computer Graphics (CG) industry has adopted a standard approach to lighting and materials called 'Physically Based Rendering', or PBR for short. Originally developed at Pixar, PBR aims to base the lighting equations on real-world physics. This gives artists a simple set of parameters that work well across many scenes, rather than constantly guessing and tweaking sliders. PBR applies to both rasterization and ray tracing, allowing scenes to be rendered using either method with minimal changes.

At Parallax, this has meant that we've been able to produce 3D renders and animations for our clients where it wouldn't have been viable before. Using Blender's realtime renderer, 'Eevee', we can test ideas and iterate extremely quickly. Being able to instantly see changes to your work hugely improves the ability to be creative and experiment without fear.

An additional benefit of creating these assets in a realtime engine is that we can export them for use as interactive elements on web pages using WebGL. AR applications such as Apple's Quick Look also use PBR to help match the lighting to real environments.
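As a rough illustration of why PBR's small parameter set travels so well between renderers, here is a simplified sketch (my example, not from the article) of how the common metallic/roughness workflow splits a material's base colour into diffuse and specular components. The 4% dielectric reflectance is a widely used approximation, and real engines layer much more on top (roughness-driven highlight shapes, energy conservation, etc.):

```python
def pbr_base_layers(base_color, metallic):
    """Split a PBR base colour into diffuse and specular components.

    In the metallic/roughness workflow, metals have no diffuse term and
    tint their specular reflections with the base colour, while
    dielectrics keep the base colour as diffuse and reflect roughly 4%
    of light as an untinted specular highlight."""
    dielectric_f0 = 0.04  # common approximation for non-metal reflectance
    diffuse = tuple(c * (1.0 - metallic) for c in base_color)
    specular = tuple(dielectric_f0 * (1.0 - metallic) + c * metallic
                     for c in base_color)
    return diffuse, specular
```

Because the same handful of physically grounded inputs (base colour, metallic, roughness) drives both a rasterizer like Eevee and a ray tracer, the same material can move between them, and out to WebGL or AR viewers, with minimal changes.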

The Future

Progress never stands still, of course. Nvidia's recently released RTX technology greatly speeds up ray tracing, blurring the lines between the two methods further. Epic's Project Spotlight uses the Unreal engine to replace greenscreens with realtime skyboxes that directors can change on a whim. VR tools like Quill are gaining traction as a way for artists to animate using intuitive gestures rather than needing to learn complex software interfaces. Exciting times ahead!