I have been fascinated with virtual characters ever since reading science-fiction novels that featured sentient personas guiding starship captains on intergalactic missions. Those concepts migrated to television and cinema, from the holodeck simulations of Star Trek: The Next Generation and Max Headroom in the ’80s to the holographic virtual companion Joi in the film Blade Runner 2049.

Although the idea of virtual humans has long been a fixture of pop culture, anime, and science fiction, Japan brought the concept of “virtual idols” to reality when the Horipro talent agency launched Kyoko Date in 1996 and Ken-ichi Kutsugi designed and created Terai Yuki in 1997. Both became CG pop stars and digital personalities who released CDs that garnered some airplay. They were covered by the media, made television and radio appearances and music videos, and were featured in publications and ads. Neither, however, generated enough compelling follow-up content to sustain its initial adulation, and neither enjoyed more than limited success.

Eventually, virtual idols waned in popularity, but in 2007 a turquoise-haired virtual idol named Hatsune Miku sparked a resurgence and rocketed to stardom in Japan. Crypton Future Media created the character using the Vocaloid engine, a singing-voice synthesis technology developed by Yamaha that produces human-like vocals from the melodies and lyrics a user enters. This gave Hatsune Miku a distinctive voice and personality and let users create original music and dialogue for the character. She became a digital phenomenon thanks to the collaborative nature of her multimedia content, which was openly shared on Nico Nico Douga, a popular Japanese video-sharing platform where fans were encouraged to write original music and remixes, create and edit videos, and illustrate and animate 2D and 3D renditions of her.

Hatsune Miku performs live as an animated holographic singer at concerts around the world. Her fanbase numbers in the millions, and she has hundreds of millions of views and listens across the internet. She has appeared in TV commercials, TV shows, animations, books, and magazines, as well as on billboards, promotional cars, and online soundtracks. She even has a full line of action figures.

Thanks to the popularity of Hatsune Miku and the availability of low-cost hardware and software, “Virtual YouTubers” have been proliferating in Japan. Virtual YouTubers are animated characters generated in real time using reasonably priced motion capture hardware and software along with MikuMikuDance (MMD), an open source animation program. One popular Virtual YouTuber, Kizuna Ai, has her own YouTube channel, A.I.Channel, which has more than 1.5 million subscribers and is growing rapidly. Ai creates Let’s Play videos, instructs viewers in how-to videos, and vlogs about various subjects on her channel. She faces increasing competition as new Virtual YouTubers debut on a regular basis.

The technologies and tools for animating CG characters in real time are now within reach of companies, educators, and content creators. Most setups require little more than low-cost facial tracking software or a game engine such as Unreal or Unity running on a mid- to upper-level consumer laptop with a powerful GPU. A new generation of mobile apps with facial and body tracking built on ARKit and ARCore will make it even easier to animate characters straight from a mobile phone.
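
To make the mobile approach concrete, here is a minimal Swift sketch of how ARKit face tracking could drive a character: the TrueDepth camera reports expression “blend shape” coefficients every frame, which can be remapped onto a character rig. The avatarJawNode and the rotation scaling are hypothetical stand-ins for whatever rig and tuning a real app would use.

```swift
import UIKit
import ARKit
import SceneKit

// Minimal sketch: use ARKit face tracking to puppet a CG character.
// Assumes an iOS device with a TrueDepth camera; avatarJawNode is a
// hypothetical bone from whatever character rig the app has loaded.
class FacePuppetViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()
    var avatarJawNode: SCNNode?  // hypothetical: the rig's jaw bone

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // Face tracking works only on TrueDepth-equipped devices.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // ARKit calls this every frame the tracked face updates.
    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode,
                  for anchor: ARAnchor) {
        guard let face = anchor as? ARFaceAnchor else { return }

        // Blend shapes are 0-to-1 coefficients for facial expressions.
        // Here, jawOpen is remapped to a rotation on the avatar's jaw;
        // the 0.5-radian range is an arbitrary tuning value.
        if let jawOpen = face.blendShapes[.jawOpen]?.floatValue {
            avatarJawNode?.eulerAngles.x = jawOpen * 0.5
        }
    }
}
```

The same coefficients cover dozens of expressions (brows, lips, eye blinks), which is what makes phone-based puppeteering plausible without a mocap suit.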

Since the technology is becoming simpler and more cost-effective, computer-generated personalities and avatars will become more prevalent online and in virtual spaces for live events, corporate communications, customer support, and training, and avatars will be integrated into more presentations. Produced streams will grow more sophisticated, featuring dynamic data displays, virtual sets, animations, interactive objects and environments, live feeds, social media, motion graphics, and more.

Content producers will find new ways to use these tools to improve how they engage audiences on behalf of their clients and to meet their creative and business objectives. Production for cinema, television, gaming, and live streaming will merge into one common pipeline.

In their current state, holograms and virtual characters are puppeteered by actors using head-mounted cameras and mocap suits. Eventually, characters will be driven by AI, which will have a profound effect on enterprise media and improve viewer satisfaction. It will also enable animated characters to speak directly with their viewers, eventually giving them increasingly human-like personalities that viewers can relate to and connect with on a hyper-personalized level.

[This article appears in the April/May 2018 issue of Streaming Media magazine as “Real-Time Virtual Characters and Idols Will Create New Forms of Streaming Content.”]