DICE 2012: Epic’s Tim Sweeney Talks Mad Science
I’ve always considered Epic Games co-founder Tim Sweeney to be the perfect analogue of id Software’s John Carmack, and after hearing him speak in person, I’m more convinced of that than ever. Sweeney’s world is so saturated with technical knowledge and expertise that he can see the future of technology as a curious eventuality rather than an intriguing mystery.
During his DICE 2012 talk, “Technology and Gaming in the Next 20 Years,” Sweeney shared his unique perspective on technology with regard to gaming and dissected where it can, and might, go. First, he addressed the notion that game processing and graphics have little room left to grow, and surprisingly, there’s some truth to that.
The human mind can only perceive and process so much data at once — the eye resolves an image of roughly 30 megapixels at a limit of around 72 frames per second. Rendering scenes at a higher resolution or frame rate than that is essentially wasted data, which means there is, objectively speaking, a cap on how good graphics can get in traditional terms. Artistic flair aside, a photoreal scene rendered at 72 FPS is unbeatable.
Of course, we don’t have the computational power to push those numbers right now, but according to Moore’s Law (the observation that computational power doubles roughly every two years), we’re only about two hardware generations away from reaching them. That leaves some technical room left to grow, but hitting the limits of human perception is a very real prospect.
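The arithmetic behind that claim can be sketched quickly. The 30-megapixel and 72 FPS figures come from the talk; the assumption that today’s hardware sits at a quarter of the target throughput is mine, chosen only so that “two hardware generations” (two Moore’s Law doublings) closes the gap:

```python
import math

# Figures from Sweeney's talk: the eye resolves roughly a 30-megapixel
# image at up to 72 frames per second.
eye_megapixels = 30
eye_fps = 72

# Perceptual ceiling in raw pixel throughput: ~2.16 billion pixels/second.
target_pixels_per_sec = eye_megapixels * 1e6 * eye_fps

# Hypothetical starting point (an assumption, not a figure from the talk):
# suppose current hardware delivers a quarter of the target.
current_pixels_per_sec = target_pixels_per_sec / 4

# Moore's Law: throughput doubles roughly every two years, so each
# "hardware generation" here is one doubling.
generations = math.ceil(math.log2(target_pixels_per_sec / current_pixels_per_sec))
years = generations * 2

print(f"Perceptual ceiling: {target_pixels_per_sec:.2e} pixels/s")
print(f"Doublings needed: {generations} (~{years} years)")
```

Swap in a different estimate of current throughput and the doubling count shifts accordingly; the point is that the gap closes in a handful of doublings, not dozens.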
This only applies to “known” properties, though. Visual elements like light, color, smoke, and water are algorithmic; mathematical formulas can simulate them correctly every time. The “unknowns” in games — thought, movement, speech, AI — can never be perfectly simulated, and thus can’t be bounded by imitation of reality. While it’s possible to render a scene that is visually indistinguishable from reality, the human intellect (and any recreation of it) is far harder to simulate convincingly.
There’s another looming bound on computing power as well. Transistors are continually shrinking, but we’re rapidly approaching transistors the size of a single atom, and physically speaking, we can’t get any smaller than that without resorting to quantum computing. Sweeney suggested a potential stopgap: stacking transistors into three-dimensional sheets would allow far more computational power to be crammed into the same physical space.
Even setting that aside, there’s yet another upper bound on computational power. The Bekenstein Bound essentially states that there will always be a physical limit on the amount of information or processing that can occur in a given physical space because of the heat it produces. It might sound obvious, but you can’t pull infinite computational power from a computer that occupies a finite amount of space. Extrapolating along Moore’s Law, that eventuality is about 200 years away.
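The “about 200 years” extrapolation is just Moore’s Law doublings carried forward, which a few lines make concrete (the two-year doubling period is from the article; the resulting growth factor is simple arithmetic, not a figure from the talk):

```python
# If compute doubles every two years, 200 more years of Moore's Law
# is 100 doublings before physics calls a halt.
years_remaining = 200
years_per_doubling = 2

doublings = years_remaining // years_per_doubling  # 100 doublings
growth_factor = 2 ** doublings                     # 2^100, a ~30-digit number

print(f"{doublings} doublings -> ~{growth_factor:.2e}x today's compute")
```

However rough the two-year constant is, the takeaway stands: the physical ceiling is astronomically far above today’s hardware, even though it exists.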
From there, Sweeney took a step back to examine the human side of technology. Despite what engineers would love to believe, the adoption and progression of technology isn’t tied strictly to Moore’s Law. Social acceptance is an incredibly important factor in pushing technology forward, and it often arrives without an identifiable catalyst. For instance, by technology standards, Facebook could have existed in 1994, but it didn’t. It takes vision and originality to make these technological leaps happen.
With that in mind, there’s no way to say exactly where technology will go, but Sweeney did identify a few interesting directions that may lead to the next socio-technological revolution on the order of the iPhone. He pointed to the connectivity and geo-location features of phones, along with their augmented reality capabilities, as a potential field of development. The Kinect and Siri also surfaced as technologies that are finally practical enough to earn social acceptance, even though motion tracking and voice recognition have existed in inferior forms for years.
He also seemed particularly captivated by the idea of virtual goods. As the Earth’s resources are increasingly consumed, goods that require fewer resources to produce will become more valued commodities. Sweeney even theorized that the digital goods market could equal or eclipse the world’s real estate market within a few decades, which is incredible.
Ultimately, Sweeney felt that there’s plenty of room to grow, though his earlier comments suggest that growth won’t follow the traditional path it has so far. Reading between the lines a little, it sounds like Sweeney expects a future where improved technology changes how we interact with games, rather than how they look through traditional delivery methods.
Personally, as someone who likes pretty lights and sparkly effects, I find it a little surprising to hear that we’re relatively close to the limit of graphics processing, but I suppose there’s no one to blame for that but our own physiology.