Distribution rules and streaming steadily impacting small theatres

Kevin Maimann writes on CBC that Canada’s indie movie theatres say industry is in crisis.

He begins by stating: “Canada’s independent cinema industry is in crisis, its owners say, as they face mounting challenges from streaming services and restrictive Hollywood studio rules.”

It turns out that “rules imposed by major studios like Disney determine when and for how long they are able to screen certain big-ticket films.”

These are called “‘clean runs,’ when studios require an independent theatre to dedicate a screen to just one film for up to four weeks, even if the film stops drawing crowds after the first week. This can be especially frustrating for small-town theatres that only have one screen.”

And they can’t rent their theatre out during that time.

The studios’ attitude, as owners describe it: if your door’s open, you’re showing our product.

Another studio constraint is “zone provisions, which keep exhibitors from playing films that are screening at bigger nearby theatres.”

By the time these theatres are allowed to screen some films, those titles may already be streaming.

See the Network of Independent Canadian Exhibitors report.

My take: could the studios and their distributors be actively trying to kill off independent cinemas? I doubt it. But small town cinemas are so much more than just Hollywood outposts; they are often the only local in-person cultural and artistic hub available to citizens.

Warner Bros. shelves “Coyote vs. Acme”

Joe Hernandez reports on NPR: Here’s why some finished films are mothballed.

He begins, “Back in November [2023], Warner Bros. Discovery announced it was not planning to release “Coyote vs. Acme,” a hybrid animated and live-action comedy starring John Cena and Will Forte that had wrapped filming a year earlier.”

And then adds that the studio shelved both “Batgirl” and “Scoob!: Holiday Haunt” in 2022.

He explores the reasons behind spending $70M, $90M and $40M on these films and then shelving them rather than releasing them to the public:

  • Money: “Hollywood financial experts say that when studios scrap finished projects the decision usually comes down to money.”
  • New Directions: “Abandoning a project may also reflect the shifting priorities of a studio.”

He concludes, “Though it may make financial sense for a studio to abandon a film, that argument may prove little comfort to the movie’s cast and crew or the fans eagerly awaiting its release.”

My take: Here’s his description of the cancelled movie: “Based on a satirical New Yorker piece, the movie followed Wile E. Coyote as he sued the Acme company after its products again and again fail to help him catch the elusive Road Runner.” That sounds hilarious! I’d watch that. And it seems cruel to greenlight a movie, get folks to spend years of time and effort working on it and then pull the rug out from under everything. Beep! Beep!

ScriptReader.ai offers script analysis for $9.99

Jason Hellerman reports on No Film School that AI Says My Screenplay is Better Than ‘Silence of the Lambs’ and ‘Schindler’s List’.

He begins, “This morning, a really lovely NFS reader reached out to me and wanted my opinion on a program called ScriptReader.AI, which charges $10 for a complete screenplay breakdown.”

He uploads one of his scripts and in two hours gets extensive coverage. It appears the AI’s method is to rate each scene and then average the scores. His script gets a “Recommend.”
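As a rough illustration of that scene-by-scene approach, here is a minimal sketch of how such scoring might work. This is my own guess at the mechanics, not ScriptReader.AI’s actual code; the function names, thresholds and verdict labels are all assumptions.

    # Hypothetical sketch of "rate each scene, then average" coverage scoring.
    # Nothing here comes from ScriptReader.AI; names, thresholds and verdicts are illustrative.
    from typing import Callable

    def score_script(scenes: list[str], rate_scene: Callable[[str], float]) -> str:
        """rate_scene stands in for a model call returning a 0-10 score per scene."""
        scores = [rate_scene(scene) for scene in scenes]
        average = sum(scores) / len(scores)
        # Map the averaged score onto familiar coverage verdicts.
        if average >= 7.5:
            return "Recommend"
        if average >= 5.0:
            return "Consider"
        return "Pass"

    # Example with a dummy rater that simply rewards longer scenes:
    verdict = score_script(
        ["INT. COFFEE SHOP - DAY ...", "INT. AIDEN'S PLACE - NIGHT ..."],
        rate_scene=lambda scene: min(10.0, len(scene) / 10),
    )
    print(verdict)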

However, Jason is not impressed. He concludes:

“Computers cannot think and feel. They have zero sense of nuance. And you cannot take their coverage seriously. No matter how nice it is. AI coverage is worthless!”

My take: the ScriptReader.ai website woefully undersells its capabilities. It certainly looks sketchy. But you can test out three scenes for free.

And then there’s a link in the top right corner of the website labelled “SHALL WE PLAY A GAME?_” I recognized this as a reference to WarGames, in which Matthew Broderick’s character unwittingly nearly triggers WWIII.

With some trepidation, I clicked on the link.

This brought me to a page titled “SceneProv_” with two modes, Fun and Training.

I decided to have some fun and generate a scene for a RomCom. The AI, suddenly my writing partner, set the scene and populated it with two characters. My interactions are bolded.

“Genre selected: romance

INT. COFFEE SHOP – DAY

JEN sits at a table with a book, sipping on her latte. She notices AIDEN walk in and their eyes meet.

JEN:
(smiling)
Hey!

AIDEN:
(nervously)
Hi! Can I sit here?

JEN:
(surprised)
Uh, sure!

AIDEN sits across from Jen and they strike up a conversation.

AIDEN:
I noticed you’re reading “Do Androids Dream of Sheep?”.

JEN:
Yup. It’s my favourite book.

Her eyes glitch.

AIDEN:
Do you want to go back to my place?

JEN:
(surprised)
Excuse me?

AIDEN:
(smiling)
Not like that. I have a first edition copy of that book. I’d love to show it to you.

JEN:
(relieved)
Oh, sure. That sounds great.

INT. AIDEN’S PLACE – NIGHT

Aiden and Jen are sitting on the couch, looking at the first edition. Her eyes glitch again.

AIDEN
Jen, why do your eyes “glitch?”

JEN
What do you mean?

AIDEN
Well, every once in a while they kinda spark.
Are you hiding something from me?

Jen looks at Aiden angrily and stands up quickly.

JEN:
(angrily)
No, I’m not hiding anything! It’s just a glitch in my synthetic retinas. I’m an android.

AIDEN:
(surprised)
An android? I had no idea.

JEN:
(sadly)
I know, it’s not exactly a desirable trait in the dating world.

AIDEN:
(smiling)
Well, I’m not like most guys. I think it’s pretty cool, to be honest.

Jen’s eyes glitch again, this time in surprise.

JEN:
(surprised)
Really?

AIDEN:
(nodding)
Really. Plus, I always thought I’d make a great cyborg myself.

Jen smiles and takes Aiden’s hand, feeling grateful for the first time to have found someone who accepts her for who she really is.

They continue to look at the book, their fingers touching.

Aiden turns to Jen, looking into her eyes.

AIDEN:
(softly)
You know, I may not be able to dream like you do, but
I have a feeling this is the beginning of something special.

JEN:
(smiling)
I have that feeling, too.

They lean in and share a kiss, the start of their unconventional love story.”

And then it rated “our” scene a B+. This AI passes the Turing Test for sure.

Hollywood A-Listers Buy Westwood Village Theatre

According to Pat Saperstein of Variety, Jason Reitman Acquires Fox Village Westwood Theater With Filmmakers Including Steven Spielberg, Christopher Nolan, JJ Abrams, Chloé Zhao.

“Jason Reitman has gathered more than two dozen filmmakers to help acquire Westwood’s historic Village Theater, which will program first-run and repertory programming.”

The 36 investors include:

  1. JJ Abrams
  2. Judd Apatow
  3. Damien Chazelle
  4. Chris Columbus
  5. Ryan Coogler
  6. Bradley Cooper
  7. Alfonso Cuarón
  8. Jonathan Dayton
  9. Guillermo del Toro
  10. Valerie Faris
  11. Hannah Fidell
  12. Alejandro González Iñárritu
  13. James Gunn
  14. Sian Heder
  15. Rian Johnson
  16. Gil Kenan
  17. Karyn Kusama
  18. Justin Lin
  19. Phil Lord
  20. David Lowery
  21. Christopher McQuarrie
  22. Chris Miller
  23. Christopher Nolan
  24. Alexander Payne
  25. Todd Phillips
  26. Gina Prince-Bythewood
  27. Jason Reitman
  28. Jay Roach
  29. Seth Rogen
  30. Emma Seligman
  31. Brad Silberling
  32. Steven Spielberg
  33. Emma Thomas
  34. Denis Villeneuve
  35. Lulu Wang
  36. Chloé Zhao

“The Fox Village, built in 1930, has hosted hundreds of premieres over the past 90 years, including Reitman’s own “Juno,” “Licorice Pizza” and many others…. The distinctive Spanish mission revival-style building is topped by a 170-foot neon-lit tower, making it a beacon for filmgoers on the Westside of Los Angeles.”

My take: interestingly, at least four on this list are Canadians: Jason Reitman, Seth Rogen, Emma Seligman and Denis Villeneuve. I’m glad this historic cinema is being saved.

GAME OVER! OpenAI’s SORA just won the text-to-video race

The inevitability of script-to-screen technology is closer than ever.

OpenAI released test footage and announced, “Introducing Sora, our text-to-video model. All the clips in this video were generated directly by Sora without modification. Sora can create videos of up to 60 seconds featuring highly detailed scenes, complex camera motion, and multiple characters with vibrant emotions.”

See openai.com/sora for more.

“Sora is able to generate complex scenes with multiple characters, specific types of motion, and accurate details of the subject and background. The model understands not only what the user has asked for in the prompt, but also how those things exist in the physical world.”

“The model has a deep understanding of language, enabling it to accurately interpret prompts and generate compelling characters that express vibrant emotions. Sora can also create multiple shots within a single generated video that accurately persist characters and visual style.”

See the Technical Research.

Beyond text-to-video, “Sora can also be prompted with other inputs, such as pre-existing images or video. This capability enables Sora to perform a wide range of image and video editing tasks — creating perfectly looping video, animating static images, extending videos forwards or backwards in time, etc.”

Sora can even replace the whole background in a video: “Diffusion models have enabled a plethora of methods for editing images and videos from text prompts…. One of these methods, SDEdit … enables Sora to transform the styles and environments of input videos zero-shot.”
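For context, the SDEdit idea from the diffusion literature is roughly: partially noise the input video’s latents, then run the usual denoising loop conditioned on the edit prompt, so the output keeps the input’s structure while adopting the prompted style or environment. OpenAI has not published Sora’s code, so the model and scheduler interface below is entirely hypothetical; this is just a sketch of the general technique.

    # SDEdit-style editing sketch. The model/scheduler interface is hypothetical;
    # it stands in for whatever video diffusion stack is actually used.
    import torch

    def sdedit(frames, prompt, model, scheduler, strength=0.6):
        """Edit `frames` toward `prompt` while keeping their overall structure.

        strength in (0, 1]: how far into the noise schedule to start.
        Higher strength allows bigger changes to style and environment.
        """
        start_step = int(scheduler.num_steps * strength)

        # 1. Encode the input and add noise up to an intermediate step,
        #    instead of starting from pure noise as in normal generation.
        latents = model.encode(frames)
        latents = scheduler.add_noise(latents, torch.randn_like(latents), start_step)

        # 2. Reverse diffusion from that step down to zero, conditioned on the prompt.
        for t in reversed(range(start_step)):
            predicted_noise = model.denoise(latents, t, prompt)
            latents = scheduler.step(predicted_noise, t, latents)

        return model.decode(latents)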

My take: this is powerful stuff! Workers in media industries might want to start thinking about diversifying their skills….

Telefilm eyes feature films in Canada

Telefilm Canada has just published a report on Canadian Movie Consumption – Exploring the Health of Feature Film in Canada.

The study, by ERm Research, provides an “understanding of overall consumption patterns, media sources used by audiences, their decision-making process, genre preferences, barriers to watching more movies, and their theatrical moviegoing habits, as well as perceptions of Canadian content.” The study contacted 2,200 feature film consumers in Canada from September 17 to October 2, 2023.

Three of the report’s findings:

  1. 95% of Canadians aged 18+ have seen one or more feature films in the past year, with nearly three-quarters seeing a movie in theatres.
  2. Paid streaming accounts for 54% of all feature film consumption. Around nine in ten movie consumers use at least one streaming service, with most accessing multiple.
  3. French Canadian movie watchers are more inclined to see Canadian content theatrically and generally have a higher opinion of Canadian films.

Some things that stood out to me:

  1. 55% of the opening-night audience is under the age of 35, whereas by the second week 50% of the audience is 45 or older. (Page 33.)
  2. Canadian moviegoers see on average only 1.4 feature films annually. (Page 38.)
  3. The top five streamers in Canada are Netflix (67%), Amazon Prime (50%), Disney+ (39%), Crave (21%) and Apple TV+ (12%). (Page 35.)

You can download the full report here.

My take: not very encouraging. I think we need to take our cue from the Quebecois who see (and like) more Canadian films. Why is that? The obvious answer is that they’re watching French-language films, fare that Hollywood is not producing. A more nuanced answer is that they’re watching films that reflect life in their province. Unfortunately, because Canadian movies have highly limited access to cinema screens in the rest of Canada, Canadians outside of Quebec don’t have that luxury.

See every Canadian movie!

If your New Year’s resolution is to watch more Canadian films, Telefilm has you covered.

Their See It All website will help you discover Canadian movies, new (2023) and old (1973).

You can search the database of over 3,400 films by title, by new releases and by streaming platforms.

My take: I wish we could search by director or cast members too!

Super Fast Screenplay Coverage

Jason Hellerman writes on No Film School that I Got My Black List Script Rated By AI … And This Is What It Scored.

Jason says, “An AI-driven program called Greenlight Coverage gives instant feedback on your script. You just upload it, and the AI software spits coverage back to you. It rates different parts of the script on a scale from 1-10 and then gives a synopsis, positive comments, and notes on what would make it better. The program even creates a cast list and movie comps, allowing you to have an AI question-and-answer session to ask specific questions about the script.”
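To picture the shape of that output, here is a minimal sketch of how such a coverage report could be modelled. The field names are my own assumptions, not Greenlight Coverage’s actual schema.

    # Hypothetical data model for AI script coverage; field names are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class CoverageReport:
        section_scores: dict[str, int]   # e.g. {"dialogue": 7, "structure": 6}, each 1-10
        synopsis: str
        strengths: list[str]             # positive comments
        notes: list[str]                 # what would make the script better
        suggested_cast: list[str]
        comps: list[str]                 # comparable movies
        overall_score: int               # overall 1-10 verdict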

His script Himbo, which was on The Black List in 2022 and was rated incredibly high by ScriptShadow, scored 6/10 on Greenlight Coverage.

He concludes:

“The truth is, I could see a read like this coming from a human being. Is it the best coverage? No. But as someone who has tested many services out there, I felt it gave better coverage than some paid sites, which are hit-and-miss depending on the person who reads your script. I look at AI as a tool that some writers may decide to use. I was happy I tried this tool, and I honestly was surprised by the feedback of the coverage.”

My take: I also participated in the beta test of Greenlight Coverage and asked the creator Jack Zhang the following questions via email.

Michael Korican: For folks used to buying coverage for their scripts, what are the main features of Greenlight Coverage that set it apart?
Jack Zhang: The speed, accuracy, consistency as well as reliability. Also the ability to ask follow up questions that can provide guidance on how to pitch to investors and financiers, all the way to how to further develop certain characters. In the future, we will also include data from Greenlight Essentials.

MK: Writers sometimes wait weeks if not longer for coverage. How fast is Greenlight Coverage?
JZ: 15 mins to 2 hours when they first upload their screenplay, depending on their subscribed package. The follow up questions are answered instantly.

MK: In your testing of Greenlight Coverage, how have produced Hollywood scripts rated?
JZ: It’s a mixed bag; the universally critically acclaimed ones usually get a very high score 8.5 to 9+, like The Godfather, Shawshank, etc.  The bad ones like The Room got 3/10. It really depends on the screenplay and the film.

MK: Greenlight Coverage uses a neural network expert system; the coverage section is highly structured whereas the question section is open-ended. How is this done and what LLM does Greenlight Coverage use?
JZ: We are using large language models to power our back end and it is not one, but a few different ones as well as our proprietary model that was fine tuned based on industry veteran feedback.

MK: Why should those folks who hate AI give Greenlight Coverage a try for free?
JZ: I totally see where they are coming from and personally I also agree that in such a creative industry, the human touch is 100% needed. This is just a tool to help give quick feedback and an unbiased opinion on the screenplay. It is useful as another input to the script, but not the end all and be all for it.

btw, not to brag but Greenlight Coverage gave my latest script, The Guerrilla Gardeners, 8/10. Wanna produce it?

Netflix releases viewership data for the first time

Jason Hellerman reports on No Film School that Netflix Releases All Its Streaming Data for the First Time Ever.

He points out that this is a huge story because the “notoriously secretive Netflix has published all its streaming numbers for the public to see” for the first time.

Netflix will publish the What We Watched: A Netflix Engagement Report twice a year.

The report has four columns:

  1. Title, both original and licensed
  2. Whether the title was available globally
  3. The premiere date for any Netflix TV series or film
  4. Hours viewed

Some takeaways:

  • This six-month timeframe aggregates 100 billion hours viewed.
  • Over 60% of the titles appeared on Netflix’s weekly Top 10 lists.
  • 30% of all viewing was for non-English content, mainly Korean and Spanish.

Here’s the Netflix media release.

Here’s their six-month 18,000+ row spreadsheet.
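With only those four columns, even a few lines of analysis go a long way. Here is a quick sketch; the file name and exact column headers are my assumptions, so adjust them to match the actual download.

    # Rough summary of the Netflix engagement spreadsheet using pandas.
    # File name and column labels are assumptions, not Netflix's published headers.
    import pandas as pd

    df = pd.read_excel("What_We_Watched_Engagement_Report.xlsx")

    total_hours = df["Hours Viewed"].sum()          # should land near 100 billion
    top_20 = df.sort_values("Hours Viewed", ascending=False).head(20)
    global_share = df["Available Globally?"].eq("Yes").mean()

    print(f"Total hours viewed: {total_hours:,.0f}")
    print(top_20[["Title", "Premiere Date", "Hours Viewed"]])
    print(f"Share of titles available globally: {global_share:.0%}")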

My take: the industry has always wanted more transparency from Netflix and I don’t think it’s a coincidence that this report comes on the heels of the writer and actor strikes. I would love to see someone take this information and cross-reference it with genres, formats and actors. Will other streamers follow with their data?

Now and Then: how the short doc started with audio interviews

Rosie Hilder writes on Creative Bloq all about How Oliver Murray made the 12-minute Now and Then, Last Beatles Song documentary.

Oliver Murray says,

“First of all, the most important thing for me was that it felt fresh and contemporary, so we started out by recording new audio interviews with the surviving members of the band, Sean Ono Lennon and Peter Jackson. It was important to record only audio because that’s my favourite way of getting intimate and conversational interview content.”

He adds,

“I took these interviews into the edit and made a kind of podcast cut of the story, which became our foundation for the timeline…. Interviews are always a big part of my process, and are where I start because more often than not the answers that you get to questions lead you somewhere you didn’t expect and change the course of the project, so I like to do those early. It’s always useful to start with audio because it’s also the most malleable and it’s possible to go back for pick up interviews. Archive footage or access (with a camera) to the people you’re talking to actually doing what they’re talking about is much harder to acquire.”

Rosie asks him, “What is your favourite part of the finished film?”

Oliver replies: “The emotional climax of the film is definitely the moment where we get to hear John’s isolated vocal for the first time. It’s quite an emotional moment to hear him emerge from that scratchy demo.”

My take: this confirms that, to me, sound is more important than picture. I think it would have been nice to have the dates displayed on each film clip used because there are a lot of them, and they bounce around in time, from now and then.