Real-Time Live!

  • Full Conference Pass (FC)
  • Full Conference One-Day Pass (1D)
  • Basic Conference Pass (BC)
  • Student One-Day Pass (SP)

It’s the future of interactive techniques, live on stage! Watch the most innovative interactive techniques as they are presented and deconstructed live by their creators during SIGGRAPH Asia’s Real-Time Live! experience.

Real-Time Live! is one of the experiences that makes SIGGRAPH Asia unique and a must-attend event. Watch as the best of the best in real-time graphics and interactivity come together for a live competition and share their innovations. From mobile games to console games to virtual and augmented reality, creators of cutting-edge real-time graphics give you a look under the hood of their creations and share the techniques they use to achieve jaw-dropping results.

 

Real-Time Live! Chair

Luminous Productions Co., Ltd., Japan

 

Sponsored By:

 

Date: Friday, December 7th
Time: 4:00pm - 6:00pm
Venue: Hall C (4F, C Block)


MR360 Live: Immersive Mixed Reality with Live 360° Video

Abstract: DreamFlux presents MR360 Live, a new way to create immersive and interactive mixed reality applications. It blends 3D virtual objects into live-streamed 360° videos in real time, providing the illusion of interacting with objects in the video. From a standard 360° video, we automatically extract important lighting details, illuminate virtual objects, and realistically composite them into the video. Our MR360 toolkit runs in real time and is integrated into game engines, enabling content creators to conveniently build interactive mixed reality applications. We demonstrate an augmented teleportation application that allows VR users to travel to different 360° videos. The user can add and interact with digital objects in the video to create their own augmented/mixed reality world. Using a live-streaming 360° camera, we will travel to and augment the Real-Time Live! stage in front of a live audience.
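The lighting-extraction step described above (recovering illumination from an equirectangular panorama so that inserted virtual objects can be shaded consistently) can be illustrated with a minimal sketch. This is an assumption-laden toy, a luminance-weighted average direction with solid-angle weighting, and not MR360's actual algorithm:

```python
import numpy as np

def dominant_light_direction(env_map):
    """Estimate a single dominant light direction from an equirectangular
    environment map (H x W x 3, linear RGB). Illustrative only."""
    h, w, _ = env_map.shape
    # Per-pixel luminance (Rec. 709 weights).
    lum = env_map @ np.array([0.2126, 0.7152, 0.0722])
    # Spherical coordinates of each pixel centre.
    theta = (np.arange(h) + 0.5) / h * np.pi        # polar angle in [0, pi]
    phi = (np.arange(w) + 0.5) / w * 2 * np.pi      # azimuth in [0, 2*pi)
    sin_t = np.sin(theta)[:, None]
    # Unit direction vector for every pixel (y is "up").
    x = sin_t * np.cos(phi)[None, :]
    y = np.cos(theta)[:, None] * np.ones((1, w))
    z = sin_t * np.sin(phi)[None, :]
    # Weight each direction by luminance times the pixel's solid angle
    # (proportional to sin(theta) for an equirectangular grid).
    weight = lum * sin_t
    d = np.array([(weight * c).sum() for c in (x, y, z)])
    return d / (np.linalg.norm(d) + 1e-9)

# A bright band near the top pole should yield a direction with a
# dominant +y component.
env = np.zeros((64, 128, 3))
env[2:6, :, :] = 10.0
print(dominant_light_direction(env))
```

A real pipeline would go further, e.g. extracting several lights or a full spherical-harmonic approximation, and updating per frame for live video, but the solid-angle weighting shown here is the part that is easy to get wrong when sampling an equirectangular grid.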

Authors/Presenter(s): Taehyun Rhee, CMIC, Victoria University of Wellington; DreamFlux, New Zealand
Ian Loh, CMIC, Victoria University of Wellington; DreamFlux, New Zealand
Lohit Petikam, CMIC, Victoria University of Wellington; DreamFlux, New Zealand
Ben Allen, CMIC, Victoria University of Wellington; DreamFlux, New Zealand
Andrew Chalmers, CMIC, Victoria University of Wellington; DreamFlux, New Zealand


An Architecture for Immersive Interactions with an Emotional Character AI in VR

Abstract: When entering a virtual world, users expect an experience that feels natural. Huge progress has been made in motion, vision, and physical interactivity, whereas interactivity with non-playable characters lags behind. This live demo introduces a method that leads to more aware, expressive and lively agents that can answer their own needs and interact with the player. Notably, the live demo covers the use of an emotional component and the addition of a communication layer (speech) to allow more immersive and interactive AIs in VR.

Authors/Presenter(s): Gautier Boeda, SQUARE ENIX CO., LTD., Japan


Live Replay Movie Creation of Gran Turismo

Abstract: We perform live replay movie creation using recorded play data from Gran Turismo. Our real-time technologies enable movie editing, such as authoring camerawork and adding visual effects, while reproducing the race scene with high-quality graphics from the play data. We also demonstrate some of our recent developments for the future.

Authors/Presenter(s): Masamichi Sugihara, Polyphony Digital Inc., Japan
Hiroki Kashiwagi, Polyphony Digital Inc., Japan
Tatsuya Matsue, Polyphony Digital Inc., Japan


Pinscreen Avatars in your Pocket: Mobile paGAN engine and Personalized Gaming

Abstract: We will demonstrate how a lifelike 3D avatar can be instantly built from a single selfie input image, using our own team members as well as a volunteer from the audience. We will showcase some additional 3D avatars built from internet photographs and highlight the underlying technology, such as our lightweight real-time facial tracking system. Then we will show how our automated rigging system enables facial performance capture as well as full-body integration. We will showcase different body customization features and other digital assets, and show various immersive applications, such as 3D selfie themes and multi-player games, all running on an iPhone.

Authors/Presenter(s): Koki Nagano, Pinscreen, USC Institute for Creative Technologies, United States of America
Liwen Hu, Pinscreen, United States of America
Lain Goldwhite, Pinscreen, United States of America


“REALITY: Be yourself you want to be”: VTuber and presence technologies in live entertainment enabling interaction between smartphones and virtual live characters

Abstract: In this presentation we propose and demonstrate the “REALITY" platform and its associated software and hardware components, which answer needs in the live entertainment sector and the Virtual YouTuber (VTuber) space. We demonstrate how off-the-shelf software and hardware components can be combined to realize a dream in the animation of interactive virtual characters. This presentation is a collaboration between GREE, Inc., Wright Flyer Live Entertainment, Inc. (WFLE), IKINEMA and StretchSense. "REALITY" is a platform for VTubers, and in this presentation we promote the concept of "Be yourself you want to be" through associated services such as a live entertainment broadcasting service and smartphone applications.

The animation industry in Japan is mature, with numerous animation titles released and a vast number of animation fans. Fans are not limited to watching animation and purchasing related goods: many people want to become a "virtual hero" at events or on various platforms in Japan. Virtual character culture, VOCALOID, and content such as videos and live streaming are already present, and we believe that virtual talent in the form of 2D or 3D avatars is now readily accepted. Furthermore, we demonstrate that the concept can now be achieved with readily available and affordable solutions and setups from home. In addition, unlike animation, interactive bi-directional communication is possible: virtual talents respond to comments sent by audiences during live streaming, and their Twitter accounts are updated frequently in response. Fans feel fully engaged and fully interactive with their virtual idols. Interactive virtual characters, which can be set up with current consumer VR technologies, allow one to express oneself much more openly via a virtual avatar without revealing one's appearance.

The proposed REALITY platform allows easy creation and animation of different avatars, and gives one the opportunity to have multiple 3D avatars to represent one's personality in the virtual space. This clearly makes the new opportunity very appealing for a large audience. This presentation demonstrates the full concept and brings the features of the REALITY platform to VTubers and interactive virtual characters in general.

Authors/Presenter(s): Akihiro Shirai, GREE, Inc., Japan
Masashi Watanabe, Wright Flyer Live Entertainment Inc, Japan
Todd Gisby, StretchSense Limited, New Zealand
Atbin Ebrahimpour, IKINEMA, United Kingdom
Alexandre Pechev, IKINEMA, United Kingdom
Eiji Araki, Wright Flyer Live Entertainment, Inc., Japan


More Real, Less Time: Mimic's Quest in Real-Time Facial Animation

Abstract: Mimic Productions' CEO, Hermione Mitford, will present a live-stream demonstration of detailed facial animation in real-time, utilizing her photo-real 3D digital-double. The presentation will include a speech from Mitford (and her avatar) addressing Mimic's technological approach, as well as the corresponding applications for the technology. A specific focus will be placed on realism and the details of the human face.

Authors/Presenter(s): Hermione Mitford, Mimic Productions, Germany


The Power of Real-Time Collaborative Filmmaking 2

Abstract: Two of us will be on stage, with two more collaborators located in Paris, demonstrating remote work with real-time feedback. Each of us will be responsible for creating a short sequence; the goal is to show that in just a few minutes of work we can produce a fairly large movie with our approach. We will also demonstrate the other three main features of our solution live on stage:
- Real-time collaborative editing and a unified workflow: we create a 3D animated movie in a single application and modify each other’s work with instant feedback.
- We load the created scenes onto a smartphone, which serves as a virtual camera (with the help of AR); the resulting camera is then used in the cut.
- We demonstrate how the movie can be rendered in the cloud and streamed live (in real time) to a tablet. If we have time to set that up, people will be able to watch the stream live on their phones.

Authors/Presenter(s): Jean-Colas Prunier, PocketStudio, France
Stephanie Tayeb, PocketStudio, France

