Notes from Silicon Beach: AI and Hollywood – What is killing you will make you stronger

By Nicholas DeMartino ● August 08, 2018 16:30


“Hollywood and Silicon Valley are in the same business: producing algorithms,” writes artificial intelligence (AI) pioneer Yves Bergquist, one of a new breed of data scientists focused on the entertainment and media business. Scientists like Bergquist believe that to survive and thrive, the media and entertainment industry needs to embrace cognitive science; only then can it hope to compete with tech companies and address its failing business models.

The cluster of technologies generally called artificial intelligence (AI) or machine learning (ML), including fields such as big data analytics, deep learning, semantics and natural language processing, visual and auditory recognition, prediction and personalization, and conversational agents, among others, enables the creation of software that can be taught to learn and program itself: to automate repetitive tasks and to surface insights that were never before possible.

Tech-assisted Content Development

One active area of AI in the industry is content development. For example, the studio-funded think tank USC Entertainment Technology Center, where Bergquist leads an AI and neuroscience group, is mapping box office returns against elements of film narrative. Bergquist is also working on data breakdowns of movies, as shown in this demo, the work of two of his AI startups, Corto and Novamente.

Another example is Greenlight Essentials, a member of IDEABOOST Network Connect. The company has broken down decades of film screenplays into more than 40,000 unique plot elements and analyzed more than 200 million audience profiles to help filmmakers refine scripts, target audiences and sharpen marketing. Its analytic terminal allows users with no programming or mathematics background to explore and discover repeatable patterns in decades of film data.
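
The underlying idea (tag films with discrete plot elements, then look for patterns in how differently tagged titles perform) can be sketched in a few lines of code. The toy example below assumes a hypothetical table of films, tags and box office figures; it illustrates the general approach, not Greenlight Essentials’ actual methodology.

    # Illustrative only: a toy pattern search over hypothetical plot-element tags.
    # Neither the data nor the method reflects Greenlight Essentials' actual system.
    import pandas as pd

    films = pd.DataFrame({
        "title": ["Film A", "Film B", "Film C", "Film D"],
        "plot_elements": [
            {"underdog hero", "love triangle"},
            {"underdog hero", "heist"},
            {"love triangle", "courtroom twist"},
            {"heist", "courtroom twist"},
        ],
        "box_office_musd": [210.0, 95.0, 40.0, 180.0],
    })

    # Average box office for films containing a given element vs. those without it.
    all_elements = set().union(*films["plot_elements"])
    for element in sorted(all_elements):
        has = films["plot_elements"].apply(lambda s: element in s)
        lift = films.loc[has, "box_office_musd"].mean() - films.loc[~has, "box_office_musd"].mean()
        print(f"{element:18s} lift vs. rest: {lift:+.1f} $M")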

Scriptonomics is an ML application that breaks down movie scripts by scene, character, location and other components. Writers and producers can leverage the insights and comparisons the tool extracts from its massive database of past successful movies to improve subsequent drafts, as well as to sharpen pitches and target audiences – as can be seen in this example of a Scriptonomics breakdown for Titanic.
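
Because spec screenplays follow rigid formatting conventions, the first step of such a breakdown is surprisingly mechanical. The minimal sketch below assumes standard formatting (scene headings beginning with INT. or EXT., character cues in capitals) and is only an illustration of the idea, not Scriptonomics’ pipeline.

    # Minimal sketch of a script breakdown: split a screenplay into scenes and
    # collect locations and character cues. Assumes standard spec-script format.
    import re

    SCENE_HEADING = re.compile(r"^(INT\.|EXT\.)\s+(.*?)(?:\s+-\s+(DAY|NIGHT))?\s*$")
    CHARACTER_CUE = re.compile(r"^[A-Z][A-Z .'-]{1,30}$")  # e.g. "ROSE" or "JACK DAWSON"

    def breakdown(script_text: str):
        scenes = []
        current = None
        for line in script_text.splitlines():
            line = line.strip()
            heading = SCENE_HEADING.match(line)
            if heading:
                current = {"location": heading.group(2), "time": heading.group(3), "characters": set()}
                scenes.append(current)
            elif current and CHARACTER_CUE.match(line):
                current["characters"].add(line)
        return scenes

    sample = """\
    INT. DINING SALOON - NIGHT
    JACK
    I figure life's a gift.
    EXT. BOW DECK - DAY
    ROSE
    I'm flying!
    """
    for scene in breakdown(sample):
        print(scene["location"], scene["time"], sorted(scene["characters"]))

Counts derived this way (scenes per location, lines per character, and so on) are the kind of raw material the comparison features described in the next paragraph are built from.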

Founder Tammuz Dubnov says that Scriptonomics generates a geometric model of a screenplay – its DNA, if you will – so that its elements can be compared against financially successful films of the past and improved. As discussed here, Dubnov believes that this data-driven, quantitative filmmaking process will give rise to a new generation of data-assisted content studios that will help create more hits and fewer flops.
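
One way to read “geometric model” is as a feature vector: once a script’s structural elements are counted, it becomes a point in space whose distance from other films can be measured. The sketch below uses cosine similarity over invented features purely to illustrate that idea; it is not Dubnov’s actual model.

    # Toy illustration of comparing a screenplay's "DNA" (a feature vector)
    # against past films using cosine similarity. Features and numbers are invented.
    import numpy as np

    # Hypothetical structural features: [scenes, speaking characters, locations,
    # dialogue share (0-1), act-one length share (0-1)]
    new_script = np.array([180, 12, 45, 0.55, 0.28])
    past_hits = {
        "Hit 1": np.array([165, 10, 40, 0.60, 0.25]),
        "Hit 2": np.array([220, 25, 80, 0.40, 0.30]),
    }

    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    for title, vec in past_hits.items():
        print(f"{title}: similarity {cosine(new_script, vec):.3f}")

In practice such features would be normalized and weighted before any comparison; the point is simply that a “script DNA” reduces to numbers that can be measured against a reference set.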

RivetAI offers Agile Producer, a pre-production platform that automates script breakdowns, storyboards, shot lists, scheduling and budgeting. Before RivetAI, Toronto native Debajyoti Ray built an earlier AI startup, VideoAmp. That AI-powered video advertising solution helped him understand how much commercials owe to storytelling, so he decided to build an AI engine based on thousands of movie scripts, both produced and unproduced, which became RivetAI.
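
Some of that pre-production automation is, at its core, straightforward bookkeeping: once scenes are broken down, grouping them by location is a first pass at a shooting schedule. A toy version of that step (not RivetAI’s product) might look like this:

    # Toy first pass at a shooting schedule: group broken-down scenes by location
    # so all scenes at one location can be shot together. Not RivetAI's algorithm.
    from collections import defaultdict

    scenes = [
        {"number": 1, "location": "DINING SALOON", "pages": 2.5},
        {"number": 2, "location": "BOW DECK", "pages": 1.0},
        {"number": 3, "location": "DINING SALOON", "pages": 3.0},
    ]

    by_location = defaultdict(list)
    for scene in scenes:
        by_location[scene["location"]].append(scene)

    for location, group in by_location.items():
        pages = sum(s["pages"] for s in group)
        numbers = ", ".join(str(s["number"]) for s in group)
        print(f"{location}: scenes {numbers} ({pages} pages)")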

RivetAI was influenced by its production partner End Cue, which had produced Sunspring, a short film starring Thomas Middleditch from a script credited to “Benjamin, an artificially intelligent neural network.” Also: “Bubbles,” an animated film about Michael Jackson’s chimpanzee, acquired by Netflix, whose screenplay Ray found while analyzing unproduced scripts.

RivetAI’s 500 production-company customers will feed ever more data into its self-learning system as they use it to augment their storytelling. Ray compares RivetAI to AutoCAD – software that began as a drafting tool and has become a central platform for many creative professionals. To that end, RivetAI is developing products for screenwriters, corporate branded content, series television and reality shows.



A demonstration of the artificial intelligence software, Arraiy, being used to process green screen footage quickly.

Photo by Christie Hemm Klok for The New York Times.


Content Creation with an AI Assist

Computer-generated visual effects are widely used in blockbuster movies, TV shows and games. Sensei, Adobe’s AI, is now being deployed across the company’s cloud platforms to automate functions and provide intelligence. The more Sensei is used, the smarter it gets.

Also, 3D software giant Autodesk is moving towards AI-assisted generative design, which the company used on its own new facility in downtown Toronto. Massive Software, whose tools were used on Peter Jackson’s effects-heavy epics, now uses AI to automate crowd simulation and other time-sucking tasks. Its Ready to Run Agents are prefabricated AI agents that can be dropped into scenes by visual effects artists, saving time in the creation of CGI characters.
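
Crowd simulation of this kind generally works by giving each background character a few simple steering rules and letting group behavior emerge. The toy update step below, a bare-bones “boids”-style rule rather than Massive’s agent model, gives a flavor of what such an agent computes each frame:

    # Toy crowd-agent update: each agent steers toward the group centre while
    # keeping a minimum distance from neighbours. A bare-bones illustration,
    # not Massive Software's agent model.
    import numpy as np

    rng = np.random.default_rng(0)
    positions = rng.uniform(0, 100, size=(50, 2))   # 50 agents on a 100x100 plane
    velocities = np.zeros((50, 2))

    def step(positions, velocities, dt=0.1):
        centre = positions.mean(axis=0)
        cohesion = (centre - positions) * 0.01          # drift toward the crowd centre
        offsets = positions[:, None, :] - positions[None, :, :]
        dists = np.linalg.norm(offsets, axis=-1) + 1e-9
        too_close = dists < 5.0
        separation = (offsets / dists[..., None] * too_close[..., None]).sum(axis=1) * 0.05
        velocities = velocities + cohesion + separation
        return positions + velocities * dt, velocities

    for _ in range(100):
        positions, velocities = step(positions, velocities)
    print("crowd spread after 100 steps:", positions.std(axis=0))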

Arraiy is a well-funded Silicon Valley startup that uses computer vision and machine learning to automate time-consuming visual effects work like rotoscoping, which separates the layers of an image so they can be manipulated independently. The Black Eyed Peas’ music video for their song “Street Livin’” utilized Arraiy to superimpose band members over images from the civil rights era.
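
The classical version of the green-screen step that Arraiy accelerates is chroma keying: pixels close to the backing color are masked out so the foreground can be composited onto a new plate. A minimal OpenCV sketch of that classical idea, not Arraiy’s learned approach, looks like this:

    # Minimal chroma-key matte with OpenCV: mask out green-screen pixels so the
    # foreground layer can be composited over a new background. This is the
    # classical technique that ML-based tools aim to automate and improve on.
    import cv2

    frame = cv2.imread("greenscreen_frame.png")          # hypothetical input frame
    background = cv2.imread("new_background.png")        # same resolution assumed

    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Hue ~35-85 covers typical chroma green; tune per shot.
    green_mask = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))
    foreground_mask = cv2.bitwise_not(green_mask)

    fg = cv2.bitwise_and(frame, frame, mask=foreground_mask)
    bg = cv2.bitwise_and(background, background, mask=green_mask)
    composite = cv2.add(fg, bg)
    cv2.imwrite("composite.png", composite)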

The work involved in modeling, texturing, lighting, animation and performance will ultimately be automated with machine learning, says Derek Spears, Emmy Award-winning VFX artist for Game of Thrones. “Then, the next frontier will be AI-driven actor performances.”

Simulating People

We’ve seen Carrie Fisher digitally resurrected in Star Wars movies using past performances. Now we’re seeing the emergence of simulated video and voice. Rival Theory’s RAIN AI creates human-like AI for more than 100,000 game developers and agencies. Lyrebird is a tool for creating artificial voices. Adobe has demoed Voco, a prototype that generates speech that sounds like a specific person.

Clarifai is a platform that uses “computer vision,” a form of machine learning, to help customers detect faces and predict their demographics, identify celebrities, and much more. Face2Face offers real-time facial capture and reenactment. Check out this clip of a speech by President Barack Obama that he never gave.
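
Under the hood, services like these are built on top of basic computer-vision primitives such as face detection. The sketch below uses OpenCV’s bundled Haar cascade as a generic stand-in for that primitive; it is not Clarifai’s API, and real services layer much richer models on top.

    # Generic face-detection building block using OpenCV's bundled Haar cascade.
    # Only an illustration of the computer-vision primitive such services wrap
    # with richer models; this is not Clarifai's API.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    image = cv2.imread("crowd_photo.jpg")                # hypothetical input image
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
    print(f"detected {len(faces)} face(s)")
    cv2.imwrite("faces_annotated.jpg", image)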

Software pioneer Marc Canter has developed a new AI-based storytelling platform called Instigate, which takes an Instagram or Snapchat story and adds intelligence and interactivity to create what he calls “beings” – who then can have content-enabled conversations with friends.

Canter, who developed Micromind Director multimedia authoring tools, sees Instigate as an AI authoring environment for a new form of storytelling. AI makes Instigate’s beings more intelligent than the standard-issue bots that perform repetitive pre-defined tasks. 

The Ubiquity of AI and ML

Over time, this new layer of AI/ML capabilities will become standard in every company’s and every product’s technology stack, generating billions of dollars across the global business value chain. Media businesses such as digital video, advertising, marketing and VR/AR are already fundamentally driven by AI and ML capabilities, as these examples show:

  • Digital Video: AI optimizes video encoding and delivery. Visual and pattern recognition automates editing and content creation. AI-based fingerprinting protects copyright and aids in licensing and micropayments (see the fingerprinting sketch after this list). AI detects “anomalies” like piracy, violence, adult content and fakes. AI will also enable near-real-time video quality assessment, shortening timelines for content release. IBM’s Watson AI platform released what it called a cognitive movie trailer for the Fox film, Morgan, and has automated highlight reels for the World Cup and other sports events.
  • VR and AR: These applications depend on AI to create viable experiences, and are closely aligned with visual effects and game design. Cloud providers Google, Amazon and Microsoft, all of which are committed to AR and VR as engines of growth, are embedding AI into the platforms that will increasingly power immersive applications and experiences.
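
To make the fingerprinting item above concrete: the core move is reducing each frame (or clip) to a compact signature that survives re-encoding, then matching signatures against a registry of protected content. The toy sketch below uses a simple perceptual difference hash as a stand-in for the far more robust, often learned, fingerprints production systems rely on.

    # Toy content fingerprint: a perceptual difference hash (dHash) per frame.
    # Real copyright systems use far more robust, often learned, fingerprints;
    # this only illustrates the fingerprint-and-match idea.
    from PIL import Image

    def dhash(image_path: str, size: int = 8) -> int:
        img = Image.open(image_path).convert("L").resize((size + 1, size))
        pixels = list(img.getdata())
        bits = 0
        for row in range(size):
            for col in range(size):
                left = pixels[row * (size + 1) + col]
                right = pixels[row * (size + 1) + col + 1]
                bits = (bits << 1) | (left > right)
        return bits

    def hamming(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    # Hypothetical frames: a registered master frame and a suspect upload's frame.
    master = dhash("master_frame_0420.png")
    suspect = dhash("upload_frame_0420.png")
    print("likely match" if hamming(master, suspect) <= 10 else "no match")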

In the end, Hollywood is just like any other industry – as investor Benedict Evans put it, “eventually, pretty much everything will have ML somewhere inside and no one will care.”



The cognitive trailer for the AI thriller film Morgan was created with the help of IBM's AI platform, Watson.


Nicholas DeMartino is a Los Angeles-based media and technology consultant. He served as Senior Vice President of the American Film Institute. He has been part of the IDEABOOST team since its launch in 2012, now serving as chair of its Investment Advisory Group.

