Virtual SIGGRAPH 2021 Still a Feast for the Eyes and Mind

Article By : Kevin Krewell

Nvidia's SIGGRAPH demo of its metaverse for industrial design included a surprise — that wasn't CEO Jensen Huang, it was his digital twin.

As with many conferences, this year's edition of SIGGRAPH was held virtually for the second year in a row. This year it was a significantly smoother operation, with an excellent platform that integrated content discovery and viewing in one place. Considering that SIGGRAPH is a conference about computer graphics and related topics, it is relatively amenable to virtual attendance. And as an academic conference, much of the material consists of keynotes and paper presentation sessions.

Much of the SIGGRAPH content is contributed by university and industry researchers. While the virtual format is fine for paper presentations and tutorials, some of the research and art installations (haptics, for example) really need to be seen and felt in person and are not as effective over the Internet. Transitioning an attendee favorite, Real-Time Live!, in which people present interactive projects live, was a challenge. In the past it took place on a single stage; this year it was streamed from multiple locations to the SIGGRAPH platform.

A reverse pass-through VR headset from Facebook Reality Labs displays the wearer’s eyes on the outside of the head-mounted display.

This was the 48th annual SIGGRAPH conference, and with the 50th anniversary approaching, the conference organizers did not want to wait until the last moment to plan for the milestone (www.siggraph50.world). There were several retrospective sessions with luminaries of the industry, including one on Silicon Graphics (SGI), one of the breakthrough graphics and workstation companies. There was also a fireside chat with Ed Catmull and Pat Hanrahan, who were honored with the 2019 Turing Award for their role in advancing the field of computer graphics. The two shared their perspectives and remembrances on how they got started and how they helped shape the field by integrating advances in computer science, technological development, and cultural change.

The conference is always looking for interesting projects that mix science and computing. The session called "Landing on Mars With Your Eyes Open" was a talk from NASA/JPL's Rob Donnelly on the use of Terrain Relative Navigation, or TRN, to land the Mars 2020 Perseverance rover on the surface of Mars at Jezero Crater, the most hazardous Mars landing site to date. The TRN system uses computer vision to determine the lander's position against satellite imagery of Mars and to select a safe landing site. All of this was done autonomously, without control from Earth. Using TRN allowed the rover to land closer to scientifically valuable sites than ever before. The TRN system ran on two space-hardened FPGAs and a PowerPC processor.
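The core idea behind map-relative localization can be illustrated in highly simplified form: match what the descent camera sees against an onboard orbital map and take the best-matching location. JPL's actual Lander Vision System is far more sophisticated (landmark matching, state estimation, FPGA acceleration), but a toy normalized cross-correlation sketch, with all names here purely illustrative, conveys the principle:

```python
import numpy as np

def locate(descent_img, orbital_map):
    """Estimate where a descent-camera image sits within an orbital map
    using brute-force normalized cross-correlation template matching."""
    h, w = descent_img.shape
    H, W = orbital_map.shape
    t = descent_img - descent_img.mean()   # zero-mean template
    tn = np.linalg.norm(t)
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            patch = orbital_map[r:r + h, c:c + w]
            p = patch - patch.mean()
            denom = np.linalg.norm(p) * tn
            if denom == 0:
                continue                   # flat patch, no correlation defined
            score = float((t * p).sum() / denom)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Demo: embed a synthetic "descent image" inside a larger synthetic map.
rng = np.random.default_rng(0)
orbital_map = rng.random((64, 64))
descent_img = orbital_map[20:36, 30:46].copy()
pos, score = locate(descent_img, orbital_map)
print(pos, round(score, 3))  # recovers the (20, 30) offset with score ~1.0
```

A real system would of course run this matching against a pre-rendered hazard map and fuse the result with inertial measurements, all within the seconds available during descent.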

A very popular feature of SIGGRAPH has always been the making of Hollywood movies and well-known video games. This year's conference included sessions on the making of the special effects for Godzilla vs. Kong by Weta Digital and video features from Marvel Studios' WandaVision, The Falcon and The Winter Soldier, and Loki. Also featured was the Tom Hanks World War II period drama Greyhound, which blended a real berthed WWII destroyer (U.S.S. Kidd) with realistic ocean wave effects for the movie. Each of these movies and video series required extensive computer graphics and/or animation to attain the artistic visual goal. But it wasn't all about the CGI; the session on the making of Disney's animated movie Soul focused on character development and cultural authenticity.

This year's lead sponsors included Intel, which talked about ray tracing (coming soon in its Xe GPUs); Nvidia, which made numerous announcements; and Unity, which had sessions on real-time rendering for movie and design applications. Nvidia introduced some real hardware at SIGGRAPH: the RTX A2000, which brings the company's RTX architecture to a smaller low-profile, dual-slot GPU design. The A2000 delivers real-time ray tracing, AI-accelerated compute, and high-performance graphics to slim desktops, including small form factor (SFF) workstations.

But what was a little bit of a surprise this year was Nvidia's demonstration of how the company applied its Omniverse (a metaverse for industrial design and a technology for building digital twins) to its own GTC keynote earlier this year. A 30-minute video on the making of the GTC keynote shows how Nvidia created a complete digital twin of CEO Jensen Huang's kitchen, which he had been using for various keynotes since the pandemic hit. The team modeled the entire kitchen, including parts that weren't even visible, so that at one point they could completely disassemble (and reassemble) the kitchen virtually. They then created a digital twin of Jensen Huang himself by combining multiple scans with body modeling and using an actor to recreate body movements for the virtual Jensen. The virtual Jensen shows up roughly an hour into the 2021 GTC keynote.


One of the more unusual research projects came from Facebook Reality Labs. The video shows the creation of a reverse pass-through VR headset that presents a 3D image of the wearer's eyes and displays them on the outside of the Virtual Reality (VR) head-mounted display (HMD). The headset uses a perspective-correct light field display, allowing the wearer's gaze cues to support social co-presence interactions. The goal is to create a more natural interaction between the wearer of an HMD and the people around them.

The displays are thin microlens arrays, used for both the VR images and the virtual wearer's eyes. The wearer's eyes are modeled in 3D software, and gaze is captured by an IR camera inside the headset, then textured and colorized to look natural. While the result still looks a bit creepy, this research is in its early stages, and it's better than putting googly eyes on the outside of your VR HMD.

There are hundreds of conference videos to view, and on-demand content will be available through 29 October 2021. Even if you missed the show, you can still register and catch up on some of the latest graphics technologies and innovative projects using the latest technology.

This article was originally published on EE Times.

Kevin Krewell is Principal Analyst at Tirias Research. Before joining Tirias Research, he was a Senior Analyst at The Linley Group and a Senior Editor of Microprocessor Report. He spent nine years at MPR in a variety of roles, contributing numerous articles on mobile SoCs, PC processors, graphics processors, server processors, CPU IP cores, and related technology. For The Linley Group, he co-authored reports that analyzed market positioning and technical features of the various vendor products. He has more than 25 years of industry experience in both engineering and marketing positions. Before joining The Linley Group, Kevin was Director of Strategic Marketing at Nvidia and Director of Technical Marketing at Raza Microelectronics (now part of Broadcom). He spent more than a decade at AMD in various roles, including technical marketing manager and field application engineer. He also understands the needs of engineers, having spent 10 years in product design at several smaller companies. He earned a BS in electrical engineering from Manhattan College. He also holds an MBA from Adelphi University and is a member of the IEEE as well as a member of the Microprocessor Oral History SIG for the Computer History Museum.

 
