"I will develop it into Korea's 'Disney Research'" ...Lee Kwang-hee, Head of AI Research at Vive Studios
Lee Kwang-hee, Head of AI Research at Vive Studios (Photo: Vive Studios)
"Vive Studios is similar to Disney Studios in that we own creators, planners, directors, and IPs like ‘Jilju’. Just as Disney Research supports the dazzling visuals behind Disney films, ViveLab at Vive Studios will serve the same role."
Lee Kwang-hee, Head of the AI Research Lab at Vive Studios, announced on the 10th his ambition to grow ViveLab into a technology organization akin to Disney Research.
He explained that ViveLab, the research organization of Vive Studios, aims to become a driving force that grows the company's proprietary IPs through various technologies. “Disney was able to create blockbuster hits with richly enhanced visual effects thanks to Disney Research’s technological innovation,” he said. “ViveLab plays the same role.”
Vive Studios, a virtual production specialist founded by CEO Kim Se-kyu, started as a video content production company in 2003 and gradually expanded its business into advertising, game cinematics, and more. It later moved into virtual reality (VR) content, releasing the VR film Vault: Chain City in 2017, which won awards at major U.S. film festivals and was invited to the Sundance Film Festival.
To integrate visual AI and extended reality (XR) technologies into its content production pipeline, Vive Studios launched its AI research lab, ViveLab, in January 2022.
Lee began his career at Samsung Medison, focusing on expanding computer vision and AI into various domains. After working in AI in the medical field, he served as Head of Image Generation at the Artificial Intelligence Research Institute (AIRI) and AI Technology Lead at Boeing Korea Research & Technology Center, before joining Vive Studios as CTO in December 2021.
He identified virtual production solutions and virtual human technology as the two core areas of ViveLab.
Virtual production replaces traditional green screen filming and post-production compositing with LED wall panels that display background imagery in real time during shooting. Using Unreal Engine, scenes resembling the final output can be planned and adjusted in real time from the pre-production stage.
This system allows for on-the-spot reviews and decisions during production, maximizing workflow efficiency. Vive Studios claims to be the first in Korea to implement this method.
The core technology enabling this is VIT, a solution that brings LED display control, camera tracking, and real-time CG engines together on a single platform and keeps them synchronized. It can be applied to films, music videos, live performances, and various other types of content.
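To make the synchronization idea concrete, here is a minimal Python sketch of the kind of loop such a controller runs: it polls a camera tracker and re-renders the LED-wall background at the shooting frame rate so the perspective on the wall follows the physical camera. The tracker and render calls are hypothetical stand-ins; this is not VIT's actual architecture or Unreal Engine's API.

```python
import time
from dataclasses import dataclass


@dataclass
class CameraPose:
    """Position (meters) and rotation (degrees) reported by a camera-tracking system."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float


def read_tracker_pose(elapsed: float) -> CameraPose:
    # Stand-in for a hardware tracking feed; here the camera simply dollies forward.
    return CameraPose(x=0.0, y=1.6, z=-3.0 + 0.2 * elapsed, pan=0.0, tilt=0.0, roll=0.0)


def render_backdrop(pose: CameraPose) -> None:
    # Stand-in for the real-time CG engine call that redraws the LED-wall background
    # from the tracked camera's point of view.
    print(f"frame rendered from camera at z = {pose.z:+.2f} m")


def run_sync_loop(fps: int = 24, duration_s: float = 1.0) -> None:
    """Keep tracking, rendering, and the LED wall locked to one frame cadence."""
    frame_time = 1.0 / fps
    start = time.monotonic()
    while True:
        frame_start = time.monotonic()
        elapsed = frame_start - start
        if elapsed >= duration_s:
            break
        render_backdrop(read_tracker_pose(elapsed))
        # Sleep off the remainder of the frame so the wall stays in step with the camera.
        time.sleep(max(0.0, frame_time - (time.monotonic() - frame_start)))


if __name__ == "__main__":
    run_sync_loop()
```

The point of locking everything to one cadence is that the background the camera photographs off the wall is always the view computed for that exact camera position, which is what lets decisions be made on the spot rather than in post-production.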
In 2022, Vive Studios used its VIT solution in a collaboration with HYBE to produce an original story video. The company also took part in producing a 2021 music video for Big Hit Music’s group TXT and the KBS documentary Kiss the Universe.
The VIT solution is scheduled for release as a B2B offering in October. Lee emphasized its significance, saying, “This will be the first time we unveil VIT to the world after using it exclusively for in-house content production.”
Regarding the virtual human business, he highlighted two key aspects. “Besides the technology itself, IP business is just as important,” he noted. “The reason Disney and Marvel hold such influence is because they own IPs with dedicated fandoms.”
Accordingly, Vive Studios has steadily built up its own IPs, now numbering 21, from Vault, which swept global VR film festivals, to Jilju, a virtual human introduced last year. Jilju drew attention after debuting on stage at the Melon Music Awards at the end of last year and is now expanding her presence on platforms such as Instagram.
“In 10 or 20 years, humanoid robots, which resemble humans, are expected to play various roles in our daily lives,” he predicted, adding, “The core technology that serves as the software of humanoids is precisely virtual humans.”
He particularly emphasized that ViveLab utilizes all three main virtual human implementation methods: 3D modeling, 2D AI face swapping, and 2D AI cloning. “Each domain is best suited to a different technology,” he explained. “Our intention is to organically combine them to enhance the completeness of virtual humans.”
Looking ahead, there are plans to equip virtual humans with conversational abilities through integration with language models like ChatGPT. Rather than adopting large language models (LLMs) developed by big tech companies, ViveLab has begun developing its own small language models (sLLMs). “Virtual humans only need to operate within specific domains,” he said. “LLMs are too large and costly to maintain. From a B2B perspective, this approach allows us to offer visual chatbot business solutions with our own technology.”
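As a rough illustration of the domain-scoping idea, the Python sketch below shows a conversational layer that constrains a model to a fixed persona and a narrow set of domain facts, refusing questions outside them. The `generate` callable is a placeholder for whichever small model is plugged in; none of the names here reflect ViveLab's actual system or prompts.

```python
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class DomainChatbot:
    """Sketch of a domain-scoped conversational layer for a virtual human."""
    persona: str                                    # who the virtual human presents as
    domain_facts: list[str]                         # the only knowledge the bot may use
    history: list[tuple[str, str]] = field(default_factory=list)

    def build_prompt(self, user_msg: str) -> str:
        facts = "\n".join(f"- {fact}" for fact in self.domain_facts)
        turns = "\n".join(f"User: {u}\nBot: {b}" for u, b in self.history)
        return (
            f"You are {self.persona}. Answer only from the facts below; "
            f"if a question falls outside them, say you cannot answer.\n"
            f"{facts}\n{turns}\nUser: {user_msg}\nBot:"
        )

    def reply(self, user_msg: str, generate: Callable[[str], str]) -> str:
        # `generate` wraps whichever small language model is actually deployed.
        answer = generate(self.build_prompt(user_msg))
        self.history.append((user_msg, answer))
        return answer


# Usage with a dummy generator; a real deployment would call the small model instead.
bot = DomainChatbot(
    persona="a virtual performer who only discusses her own career",
    domain_facts=[
        "She debuted on stage at the Melon Music Awards.",
        "She posts regularly on Instagram.",
    ],
)
print(bot.reply("Where did you debut?",
                generate=lambda prompt: "I debuted on stage at the Melon Music Awards."))
```

Keeping the model small and the prompt tightly scoped is what makes the B2B economics he describes plausible: a narrow chatbot does not need the capacity, or the serving cost, of a general-purpose LLM.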
While many companies are currently focused on developing “conversational virtual humans,” ViveLab takes a different approach. “Instead of focusing on only one aspect, we are preparing for the future within the essence of video content production,” he said. “This means leveraging our technology in current business applications while also preparing for the future metaverse market.”
In fact, Vive Studios is co-producing the KBS 50th Anniversary historical drama The Goryeo-Khitan War, with ViveLab’s technologies being actively deployed in the production. Unlike other tech companies that build a technology and later look for a use case, ViveLab creates technologies tailored to already-defined domains. This, he explained, is what sets ViveLab apart.
Ultimately, he emphasized that the goal of ViveLab is to become a world-class research lab that continuously develops both content and production technology.
“Many AI developers work without having a clear domain of application. But without a domain, there’s little that technology alone can achieve. This creates real challenges,” he said. “On the other hand, our aim is to become a remarkable company like Disney Research, one that owns both IP and technology.”
He concluded, “The ideal company is one with a clear domain, where AI technology is integrated and domain experts collaborate closely with AI engineers.”
LED Wall and Filming Equipment inside ViveLab