Principal Researcher and Manager of the Computer Graphics Group, Microsoft Research
Title: Automating image and video morphing
Abstract: Creating transitions between images of different objects is a classical problem in computer graphics. The main challenge is to establish a map that aligns corresponding image elements. In this talk I’ll present techniques to help automate this often tedious task. The main idea is to optimize the alignment of structurally compatible image neighborhoods. We formulate this optimization over a halfway parametric domain so that regions can either grow or shrink without introducing mapping discontinuities. The halfway domain also enables direct evaluation of the morph in a pixel shader without mesh rasterization. The algorithm is parallelized on a GPU to achieve a responsive interface. Finally, I’ll show several extensions of the method, including the creation of morphs between videos -- all with little user effort.
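The abstract's halfway-domain idea can be illustrated with a rough sketch (the function and its details are illustrative assumptions, not the talk's actual algorithm): each halfway-domain pixel p carries a half-vector v, so that p - v and p + v are corresponding points in the two input images; the morph at time t blends the two sampled colors while moving the correspondence linearly from p - v to p + v.

```python
import numpy as np

def morph(img0, img1, v, t):
    """Evaluate a morph through a halfway domain (illustrative sketch).

    img0, img1 : (H, W) grayscale images
    v          : (H, W, 2) halfway vectors; halfway pixel p corresponds to
                 p - v in img0 and p + v in img1
    t          : morph time in [0, 1]
    """
    H, W = img0.shape
    out = np.zeros((H, W), dtype=float)
    weight = np.zeros((H, W), dtype=float)
    for y in range(H):
        for x in range(W):
            vy, vx = v[y, x]
            # endpoints of the correspondence in the two input images
            y0, x0 = int(round(y - vy)), int(round(x - vx))
            y1, x1 = int(round(y + vy)), int(round(x + vx))
            if not (0 <= y0 < H and 0 <= x0 < W and 0 <= y1 < H and 0 <= x1 < W):
                continue
            color = (1 - t) * img0[y0, x0] + t * img1[y1, x1]
            # the correspondence moves linearly from p - v (t=0) to p + v (t=1)
            yt = int(round(y + (2 * t - 1) * vy))
            xt = int(round(x + (2 * t - 1) * vx))
            if 0 <= yt < H and 0 <= xt < W:
                out[yt, xt] += color
                weight[yt, xt] += 1.0
    return out / np.maximum(weight, 1.0)
```

With v = 0 everywhere this reduces to a plain cross-dissolve; the talk's contribution is in how v is optimized and evaluated directly on the GPU, which this CPU sketch does not attempt.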
Bio: Hugues Hoppe is a principal researcher and manager of the Computer Graphics Group at Microsoft Research. His main interests lie in the multiresolution representation, parameterization, and synthesis of geometry, images, and video. He received the 2004 ACM SIGGRAPH Computer Graphics Achievement Award for pioneering work on surface reconstruction, progressive meshes, geometry texturing, and geometry images. He has published many papers at ACM SIGGRAPH and Transactions on Graphics. Contributions at Microsoft include mesh simplification and optimization in DirectX, texture synthesis technology, motion recognition in Kinect Star Wars, and seamless stitching of the terapixel sky in WorldWide Telescope. He is an ACM Fellow, served as editor-in-chief of ACM TOG, and was papers chair for SIGGRAPH 2011.
Graphics Engineering Architect, Bungie
Title: Applied graphics research for video games: solving real-world problems under real-world constraints (Lessons from Destiny's development)
Abstract: In today's video games, real-time graphics are often a crucial aspect of engaging the player. Detailed and beautifully rendered worlds help create immersive environments for the player to participate in. As game developers continue to improve real-time rendering techniques, extracting the most out of current console capabilities to give players the best experience possible, they benefit from leaning strongly on applied graphics research. This talk will cover the practical considerations video game developers face with respect to hardware constraints and production considerations, including budget, resource, and time constraints.
In this talk, we will focus on how graphics research can be applied under the real-world constraints of video games, drawing on lessons learned and patterns observed throughout Destiny's development. We will cover several examples of Destiny's applied research and their integration into the shipping game's engine. We will discuss techniques that were successfully integrated into the shipping game, as well as analyze originally promising methods that ultimately failed to work under real hardware constraints or the content-creation needs of a shipping title. We will cover in depth what it means to ship a cross-platform, scalable rendering engine that meets today's rendering expectations, and the corresponding considerations for applied real-time research. We will analyze real-world hardware constraints with respect to graphics techniques, discuss the content-creation implications that matter for graphics algorithms, and examine current trends in game-world creation. Lastly, we will discuss some of the remaining challenges game developers face going forward, in hopes of spurring more applied research in this domain.
Bio: Natalya Tatarchuk is a Graphics Engineering Architect working on the state-of-the-art cross-platform next-gen rendering engine and game graphics for Bungie's latest video game, Destiny. Previously she was a graphics software architect and project lead in the Game Computing Application Group at AMD's Graphics Products Group (Office of the CTO), where she pushed parallel-computing boundaries by investigating innovative real-time graphics techniques. Additionally, she led ATI's demo team, creating innovative interactive renderings, and led the tools group at ATI Research. She has published papers and articles at various computer graphics conferences and in technical book series, and has presented her work at graphics and game developer conferences worldwide.
Research Lead, NVIDIA
Title: Physics in Games
Abstract: Physical simulations have a long history in engineering and have been successfully used to complement real world experiments. Computer simulations are used to study extreme conditions and very small time intervals. As a consequence, the accuracy of the models used and their results are central to engineering applications. For more than three decades, physical simulations have also been used in computer graphics in order to increase the realism of animations and to free artists from animating secondary motion by hand. Here, accuracy is only important to the extent that plausible behavior is generated. There are, however, additional requirements not present in engineering such as controllability: movie directors and game developers want to be able to control the collapse of a building or the path taken by a flood wave. Another aspect that is central to games is stability. In the film industry, if a simulation becomes unstable or does not yield the desired result, the artist simply re-runs the shot. This is not possible in games because the way in which the player will interact with a scene is not known in advance. In my talk I will present a variety of simulation methods we have developed that are stable enough for games, while still producing plausible physical behavior. Examples are approaches to simulate soft bodies, clothing, destruction, liquids and a unified solver combining all these effects.
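The stability requirement emphasized above is one motivation for position-based dynamics, the approach the speaker pioneered: instead of integrating forces, predicted positions are iteratively projected onto constraints, which cannot overshoot the way stiff forces can. Here is a minimal sketch for distance constraints (the function name, parameters, and setup are my own illustrative choices, not the PhysX implementation):

```python
import numpy as np

def simulate_pbd(x, v, masses, constraints, dt=0.016, iters=10,
                 gravity=(0.0, -9.81)):
    """One position-based-dynamics step for distance constraints.

    x           : (N, 2) particle positions
    v           : (N, 2) particle velocities
    masses      : (N,) particle masses; use np.inf to pin a particle
    constraints : list of (i, j, rest_length) distance constraints
    """
    w = 1.0 / masses                             # inverse masses (0 if pinned)
    p = x + dt * (v + dt * np.asarray(gravity))  # predict positions
    p[w == 0.0] = x[w == 0.0]                    # pinned particles do not move
    for _ in range(iters):                       # Gauss-Seidel projection
        for i, j, rest in constraints:
            d = p[i] - p[j]
            dist = np.linalg.norm(d)
            if dist < 1e-9 or w[i] + w[j] == 0.0:
                continue
            # move each endpoint along d, weighted by inverse mass,
            # until the distance equals the rest length
            corr = (dist - rest) / (w[i] + w[j]) * d / dist
            p[i] -= w[i] * corr
            p[j] += w[j] * corr
    v_new = (p - x) / dt                         # velocities from motion
    return p, v_new
```

Because each projection moves particles at most onto the constraint manifold, the step remains stable for arbitrary time steps and user interaction, which is exactly the property games need that the abstract contrasts with offline film simulation.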
Bio: Matthias Müller is Research Lead of the PhysX SDK team at NVIDIA. PhysX is a GPU accelerated physically based simulation engine for computer games. His research interests include the development of methods for the simulation of rigid bodies, fracture, soft bodies, cloth and fluids that are fast, controllable and robust enough to be used in game environments. He is a pioneer in the field of position based dynamics and has been contributing to this and other fields via numerous publications in the major computer graphics conferences and journals. Position based dynamics has become the standard for the simulation of soft bodies and cloth in computer games and has been adopted by the film industry as well.
Matthias Müller received his Ph.D. from ETH Zürich for his work on the atomistic simulation of dense polymer systems. During a two-year post-doc with the computer graphics group at MIT, he shifted his research focus from atomistic offline simulations to macroscopic real-time simulation in computer graphics. In 2002 he co-founded Novodex, a company that developed a simulation engine for computer games. In 2004 Novodex was acquired by AGEIA, which, in turn, was acquired by NVIDIA in 2008.