Get Your Game On | Next Gen Architectural Visualization

Girl wears Oculus Rift to experience the design of a virtual building

Posted by Brandon Garrett

Several current and emerging technologies are changing the way we visualize architectural designs. Architectural visualization can range from traditional hand drawings to computer-generated (CG) images and animations.

Hand drawn sketch to represent an architectural design.

Over the last decade, computer-generated renderings have become commonplace, yet the means and methods of producing them have changed drastically. Today's rendered images and animations have a photorealistic quality. This progression is a result of Moore's law, which states that the number of transistors in a dense integrated circuit doubles approximately every two years. Because of increased processing power and more capable tools, renderings now mimic actual photos. The medium itself, however, has not changed: most architectural visualization work still results in either a still image or an animation. Interestingly, industry analysts predicted that this growth in processing power would slow at the end of 2013, with transistor counts and densities doubling only every three years. So what's next? Have we reached the pinnacle of architectural visualization? I argue not.

Historically, technological improvements have been driven by industry demand. CG renderings and animations have benefited greatly from the work of visual effects studios like Weta Digital and Industrial Light & Magic, which created the effects for films like The Lord of the Rings, Star Wars, and Avatar. Many of these studios use the same software, such as 3ds Max, that Dekker/Perich/Sabatini uses on our architectural projects.

Photorealistic rendering of an architectural design.

Animation studios have taken CG work to new heights, but I believe the future of architectural visualization will benefit from an entirely new industry.

According to Fast Company, a new pop-culture milestone was reached in September 2013, when Grand Theft Auto V did $800 million in worldwide sales in only 24 hours. That was the biggest launch day ever for any piece of entertainment: any movie, any record, anything at all. Video games are now a multibillion-dollar industry that rivals film industry sales.

The technological backbone of any video game is its engine: the platform for visualizing the game. It controls all of the game's interactivity, artificial intelligence, and aesthetics. Most game engines are proprietary, but some are available to the public. The most popular are Unity 3D, Unreal Development Kit (UDK), CryEngine, and Source.

So how does a game engine differ from the software we use in our architecture firm, applications like Revit and 3ds Max? Traditional applications must calculate how light (artificial or sunlight) interacts with a modeled environment. This calculation can take minutes, hours, or even days depending on the complexity of a scene. And because an animation is a compilation of several thousand still images, the time required to produce one multiplies accordingly. A game engine calculates this information much differently, allowing everything to be displayed in real time. These engines are often described as "what you see is what you get," or WYSIWYG (pronounced wiz-zee-wig): the time-consuming rendering process is eliminated, and the finished product appears exactly as it does on screen.
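The difference in scale is easy to see with some back-of-the-envelope arithmetic. The numbers below (frame rate, clip length, per-frame render time) are illustrative assumptions, not benchmarks, but they show why an offline-rendered animation takes hours while a game engine must finish each frame in a fraction of a second:

```python
# Illustrative comparison: offline rendering vs. a real-time frame budget.
# All figures below are assumed for the sake of the example.

FPS = 30                 # frames per second in the finished animation
DURATION_S = 60          # a one-minute walkthrough
MINUTES_PER_FRAME = 10   # assumed offline render time for one still image

# An animation is a compilation of thousands of still images:
frames = FPS * DURATION_S                       # 1,800 frames
offline_hours = frames * MINUTES_PER_FRAME / 60 # total offline render time

# A real-time engine instead has a hard time budget per frame:
realtime_budget_ms = 1000 / FPS                 # ~33 ms at 30 fps

print(f"{frames} frames -> {offline_hours:.0f} hours of offline rendering")
print(f"real-time budget: {realtime_budget_ms:.1f} ms per frame")
```

With these assumed numbers, the one-minute clip costs 300 hours of offline rendering, while the game engine has about 33 milliseconds to draw each frame, which is what makes an interactive walkthrough possible.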

We have been exploring the use of game engines in architecture and have developed the means and methods of importing a BIM or SketchUp model directly into a game engine. This gives us the ability to offer real-time walkthroughs of architectural designs, putting clients in control of the experience. This leads to a more engaging design process.

The use of game engines in architectural visualization will also allow our designers to create truly immersive environments. A static building can come to life when populated with moving people, natural landscapes, and artificial climates.

This technology is evolving rapidly, and soon game engines will be capable of creating photorealistic environments in real time. On the hardware side, next-gen virtual reality (VR) is becoming... a reality. Facebook recently acquired Oculus VR, maker of the Rift headset. The Rift uses custom tracking technology, allowing you to seamlessly look around the virtual world just as you would in real life. Every subtle movement of your head is tracked in real time, creating a natural and intuitive experience.
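At its core, that head tracking boils down to turning measured head angles into a camera direction every frame. The sketch below is a simplified illustration of that idea (the function name and the yaw/pitch-only model are my own assumptions; a real headset also tracks roll and position, and uses sensor fusion):

```python
import math

def look_direction(yaw_deg: float, pitch_deg: float) -> tuple:
    """Convert tracked head yaw/pitch (in degrees) into a unit view vector.

    Yaw rotates the view about the vertical axis; pitch tilts it up or down.
    A hypothetical, simplified version of what tracking feeds the camera.
    """
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    x = math.cos(pitch) * math.sin(yaw)   # left/right component
    y = math.sin(pitch)                   # up/down component
    z = math.cos(pitch) * math.cos(yaw)   # forward component
    return (x, y, z)

# Looking straight ahead points the camera down the forward (+z) axis:
print(look_direction(0, 0))   # -> (0.0, 0.0, 1.0)
```

Each frame, the engine reads the latest head orientation, recomputes this vector, and redraws the scene from the new viewpoint, which is why the virtual world appears to stay still while you turn your head.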

Man wears Oculus Rift. Photo courtesy of Sergey Galyonkin/Flickr.


Just imagine being able to virtually explore a project early in the design phase while experiencing light, shadow, and scale. I see this happening within the next five years, but clients don't have to wait: we already have the capability to put clients in control and let them view projects like never before.