As graphically rendered applications place ever greater demands on today's hardware, Nvidia is pushing the most processor-intensive rendering steps into the cloud.
The idea behind this new experimental rendering process is to bypass processor-heavy lighting systems on client machines, run those algorithms off-site, and then send the compressed results back to the client machine, which renders the new lighting maps in real time.
Why The Cloud?
Rendering high-quality graphics on a computer system, regardless of type or specifications, is a processor-intensive activity. The highest levels of graphical rendering are only attainable with expensive hardware on specifically designed systems. However, today's market trends are leading to smaller and more inexpensive systems that include smartphones and tablet computers.
This change in hardware buying patterns shrinks the number of consumers who have access to the best and most visually impressive graphically rendered applications. If Nvidia can harness cloud computing to render lighting in a fast and reliable way, then lower-power systems will be able to run much higher-quality graphical applications despite their modest hardware specifications.
Three Primary Algorithms
There are three main algorithms used by Nvidia to render light: voxel, irradiance map, and photon. Each of these algorithms is designed for certain applications and has its own benefits and drawbacks.
The voxel algorithm in this new cloud-based rendering system is designed for smaller devices, such as tablets. The algorithm is processed offline or dynamically. The results are encoded using H.264, the video-compression standard also known as MPEG-4 AVC, to save bandwidth, and then decoded on the original source device.
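The voxel pipeline described above can be sketched roughly as follows. This is a toy illustration, not Nvidia's implementation: the voxel grid, the falloff lighting, and the function names are invented, and `zlib` stands in for the H.264 encode/decode step so the sketch stays self-contained.

```python
import zlib

GRID = 8  # voxels per axis (illustrative resolution)

def server_compute_voxel_lighting():
    """Cloud side: fill a GRID^3 voxel grid with 8-bit radiance values."""
    voxels = bytearray()
    for x in range(GRID):
        for y in range(GRID):
            for z in range(GRID):
                # Toy light at the grid origin: brightness falls off with distance.
                dist = x + y + z
                voxels.append(max(0, 255 - 16 * dist))
    return bytes(voxels)

def server_encode(voxels):
    # Stand-in for the H.264 encode step that saves bandwidth.
    return zlib.compress(voxels)

def client_decode(packet):
    # Stand-in for the H.264 decode step on the source device.
    return zlib.decompress(packet)

lighting = server_compute_voxel_lighting()
packet = server_encode(lighting)       # sent over the network
received = client_decode(packet)       # decoded on the client

assert received == lighting
print(len(lighting), len(packet))      # the encoded packet is smaller than the raw grid
```

The point of the encode step is simply that a voxelized lighting grid is highly redundant, so a video-style codec can shrink it substantially before it crosses the network.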
Irradiance mapping requires more client-side processing power, so only part of the lighting computation is performed in the cloud. All indirect light is calculated and encoded in the cloud, and the new indirect-light packets are sent back to the client. The client machine processes direct light and then combines it with the indirect light that was calculated in the cloud. This split between cloud and client-side rendering is designed for, and performs best on, notebook-sized systems and up.
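The direct/indirect split can be sketched with a few lines of code. Everything here is illustrative and assumed, not Nvidia's actual math: the constant ambient term, the per-texel cosine data, and the additive combine are stand-ins for the real irradiance-map computation.

```python
def cloud_indirect_light(texels):
    # Cloud side: stand-in for the indirect-light solve; here just a
    # constant ambient bounce term per texel.
    return [0.25 for _ in texels]

def client_direct_light(texels):
    # Client side: simple Lambertian term from a per-texel incidence cosine.
    return [max(0.0, cos_theta) for cos_theta in texels]

def client_combine(direct, indirect):
    # The client sums its direct light with the cloud's indirect light,
    # clamping the result to 1.0.
    return [min(1.0, d + i) for d, i in zip(direct, indirect)]

cos_angles = [1.0, 0.5, -0.3]                 # per-texel incidence cosines (toy data)
indirect = cloud_indirect_light(cos_angles)   # computed off-site, sent as packets
direct = client_direct_light(cos_angles)      # computed locally
final = client_combine(direct, indirect)
print(final)  # [1.0, 0.75, 0.25]
```

The design point is that direct light changes every frame with the camera and lights, so it must stay on the client, while indirect light changes slowly and can tolerate the round-trip latency to the cloud.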
The final system is photons. This system uses a cloud-based ray tracer that traces photons into a photon map. The photon map is then sent back to the client machine using a bit-packed encoding system. Old photons expire and are replaced with the new cloud-mapped photons. The client-side system renders the direct light and then computes the combination of indirect and direct light. This type of rendering requires a very powerful system to accomplish.
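A minimal sketch of the client side of that loop might look like the following. The 8-byte record layout, the fixed-capacity map, and the oldest-first expiry policy are all assumptions for illustration, not Nvidia's actual bit-packed encoding.

```python
import struct

def pack_photon(x, y, z, power):
    # Quantize position to 16 bits per axis plus a 16-bit power value:
    # an 8-byte record, as a toy stand-in for the bit-packed encoding.
    return struct.pack("<4H", x, y, z, power)

def unpack_photon(record):
    return struct.unpack("<4H", record)

class ClientPhotonMap:
    """Client-side photon store: new cloud photons displace the oldest ones."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.photons = []  # oldest first

    def receive(self, records):
        # Unpack the new batch of cloud-traced photons.
        for rec in records:
            self.photons.append(unpack_photon(rec))
        # Expire the oldest photons once over capacity.
        overflow = len(self.photons) - self.capacity
        if overflow > 0:
            del self.photons[:overflow]

pmap = ClientPhotonMap(capacity=2)
pmap.receive([pack_photon(1, 2, 3, 100), pack_photon(4, 5, 6, 80)])
pmap.receive([pack_photon(7, 8, 9, 60)])   # the oldest photon expires
print(pmap.photons)  # [(4, 5, 6, 80), (7, 8, 9, 60)]
```

The expiry step matters because photon batches arrive continuously from the cloud; without it, stale lighting from old photons would linger alongside the fresh results.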