REAL-TIME LIVE!

  • Full Conference Pass (FC)
  • Full Conference One-Day Pass (1D)
  • Basic Conference Pass (BC)
  • Student One-Day Pass (SP)

Date: Friday, December 7th
Time: 4:00pm - 6:00pm
Venue: Hall C (4F, C Block)
Session Chair(s): Isamu Hasegawa, Luminous Productions Co., Ltd.


MR360 Live: Immersive Mixed Reality with Live 360° Video

Abstract: DreamFlux presents MR360 Live, a new way to create immersive and interactive mixed reality applications. It blends 3D virtual objects into live-streamed 360° videos in real time, creating the illusion of interacting with objects in the video. From a standard 360° video, we automatically extract the important lighting details, illuminate virtual objects, and realistically composite them into the video. Our MR360 toolkit runs in real time and is integrated into game engines, enabling content creators to conveniently build interactive mixed reality applications. We demonstrate an augmented-teleportation application that allows VR users to travel to different 360° videos, where they can add and interact with digital objects to create their own augmented/mixed reality world. Using a live-streaming 360° camera, we will travel to and augment the Real-Time Live! stage in front of a live audience.

Authors/Presenter(s): Taehyun Rhee, Computational Media Innovation Centre (CMIC), Victoria University of Wellington; DreamFlux, New Zealand
Ian Loh, Computational Media Innovation Centre (CMIC), Victoria University of Wellington; DreamFlux, New Zealand
Lohit Petikam, Computational Media Innovation Centre (CMIC), Victoria University of Wellington; DreamFlux, New Zealand
Ben Allen, Computational Media Innovation Centre (CMIC), Victoria University of Wellington; DreamFlux, New Zealand
Andrew Chalmers, Computational Media Innovation Centre (CMIC), Victoria University of Wellington; DreamFlux, New Zealand


An Architecture for Immersive Interactions with an Emotional Character AI in VR

Abstract: When entering a virtual world, users expect an experience that feels natural. Huge progress has been made with regard to motion, vision, and physical interactivity, whereas interactivity with non-playable characters lags behind. This live demo introduces a method that leads to more aware, expressive, and lively agents that can answer their own needs and interact with the player. Notably, the live demo covers the use of an emotional component and the addition of a layer of communication (speech) to allow more immersive and interactive AIs in VR.

Authors/Presenter(s): Gautier Boeda, Square Enix Co., Ltd., Japan
Yuta Mizuno, Square Enix Co., Ltd.


Live Replay Movie Creation of Gran Turismo

Abstract: We perform live replay movie creation using the recorded play data of Gran Turismo. Our real-time technologies enable movie editing such as authoring camerawork and adding visual effects while reproducing the race scene with high quality graphics from the play data. We also demonstrate some recent development for the future.

Authors/Presenter(s): Masamichi Sugihara, Polyphony Digital Inc., Japan
Hiroki Kashiwagi, Polyphony Digital Inc., Japan
Tatsuya Matsue, Polyphony Digital Inc., Japan


Pinscreen Avatars in your Pocket: Mobile paGAN engine and Personalized Gaming

Abstract: We will demonstrate how a lifelike 3D avatar can be instantly built from a single selfie input image, using our own team members as well as a volunteer from the audience. We will showcase some additional 3D avatars built from internet photographs and highlight the underlying technology, such as our lightweight real-time facial tracking system. Then we will show how our automated rigging system enables facial performance capture as well as full-body integration. We will showcase different body customization features and other digital assets, and show various immersive applications such as 3D selfie themes and multi-player games, all running on an iPhone.

Authors/Presenter(s): Koki Nagano, Pinscreen, USC Institute for Creative Technologies, United States of America
Liwen Hu, Pinscreen, United States of America
Lain Goldwhite, Pinscreen, United States of America


“REALITY: Be yourself you want to be” — VTuber and presence technologies in live entertainment that enable interaction between smartphones and virtual live characters

Abstract: In this presentation we propose and demonstrate the “REALITY” platform and associated software and hardware components to answer needs in the live entertainment sector and the virtual YouTuber (VTuber) space. We demonstrate how off-the-shelf software and hardware components can be combined to realize a dream in the animation of interactive virtual characters. This presentation is a collaboration between GREE, Inc., Wright Flyer Live Entertainment, Inc. (WFLE), IKINEMA, and StretchSense. “REALITY” is a platform for VTubers, and in this presentation we promote the concept of “Be yourself you want to be” with associated services such as a live entertainment broadcasting service and smartphone applications.

The animation industry in Japan is mature, with numerous animation titles released and a vast number of animation fans. Fans are not limited to watching animation and purchasing related goods; many people in Japan desire to become a “virtual hero” at events or on various platforms. Virtual character culture, VOCALOID, and content such as videos and live streaming are already present, and we believe that virtual talent in the form of 2D or 3D avatars is now easily accepted. Furthermore, we demonstrate that the concept can now be achieved with readily available and affordable solutions and setups from home. In addition, unlike animation, interactive bi-directional communication is possible: virtual talent responds to comments sent by audiences during live streaming, and Twitter accounts are updated frequently in response. Fans feel fully engaged with their virtual idols. Interactive virtual characters, which can be set up with current consumer VR technologies, allow one to express oneself much more openly via a virtual avatar without revealing one's appearance.

The proposed REALITY platform allows easy creation and animation of different avatars, and an opportunity to have multiple 3D avatars to represent one's personality and oneself in virtual space. This clearly makes the new opportunity very appealing to a large audience. This presentation demonstrates the full concept and brings the features of the REALITY platform to VTubers and interactive virtual characters in general.

Authors/Presenter(s): Akihiro Shirai, GREE, Inc., Japan
Masashi Watanabe, Wright Flyer Live Entertainment, Inc., Japan
Todd Gisby, StretchSense Limited, New Zealand
Atbin Ebrahimpour, IKINEMA, United Kingdom
Alexandre Pechev, IKINEMA, United Kingdom
Eiji Araki, Wright Flyer Live Entertainment, Inc., Japan


More Real, Less Time: Mimic's Quest in Real-Time Facial Animation

Abstract: Mimic Productions' CEO, Hermione Mitford, will present a live-stream demonstration of detailed facial animation in real-time, utilizing her photo-real 3D digital-double. The presentation will include a speech from Mitford (and her avatar) addressing Mimic's technological approach, as well as the corresponding applications for the technology. A specific focus will be placed on realism and the details of the human face.

Authors/Presenter(s): Hermione Mitford, Mimic Productions, Germany


The Power of Real-Time Collaborative Filmmaking 2

Abstract: We propose two people on stage with two more people located in Paris. We will then demonstrate remote work with real-time feedback. Each of us will be responsible for creating a short sequence. The goal is to show that with a real-time unified collaborative workflow (the CUP® workflow) we can produce a rather complex movie in just a few minutes of work. We will demonstrate three key features of our solution live on stage. First, real-time collaborative editing and a unified workflow: we create a 3D animation movie in a single application and modify each other’s work with instant feedback. Then we will load the created scenes onto a smartphone that will be used as a virtual camera (with the help of augmented reality). Finally, we will show a version of the movie rendered live in real time with RTX technology on the server.

Authors/Presenter(s): Jean-Colas Prunier, PocketStudio, France
Stephane Tayeb, PocketStudio, France


Real-time character animation of BanaCAST (BANDAI NAMCO Studios Inc.)

Abstract: High-quality real-time CG character animation by BANDAI NAMCO Studios Inc., combining the latest motion capture technology with a game engine.

Authors/Presenter(s): Naohiko Morimoto, Bandai Namco Studios Inc., Japan
Jun Ohsone, Bandai Namco Studios Inc., Japan
Shoko Doi, Bandai Namco Studios Inc., Japan

