Super Fast Screenplay Coverage

Jason Hellerman writes on No Film School that I Got My Black List Script Rated By AI … And This Is What It Scored.

Jason says, “An AI-driven program called Greenlight Coverage gives instant feedback on your script. You just upload it, and the AI software spits coverage back to you. It rates different parts of the script on a scale from 1-10 and then gives a synopsis, positive comments, and notes on what would make it better. The program even creates a cast list and movie comps, allowing you to have an AI question-and-answer session to ask specific questions about the script.”

His script Himbo, which made The Black List in 2022 and was rated incredibly high by ScriptShadow, scored 6/10 on Greenlight Coverage.

He concludes:

“The truth is, I could see a read like this coming from a human being. Is it the best coverage? No. But as someone who has tested many services out there, I felt it gave better coverage than some paid sites, which are hit-and-miss depending on the person who reads your script. I look at AI as a tool that some writers may decide to use. I was happy I tried this tool, and I honestly was surprised by the feedback of the coverage.”

My take: I also participated in the beta test of Greenlight Coverage and asked the creator Jack Zhang the following questions via email.

Michael Korican: For folks used to buying coverage for their scripts, what are the main features of Greenlight Coverage that set it apart?
Jack Zhang: The speed, accuracy, consistency as well as reliability. Also the ability to ask follow-up questions that can provide guidance on everything from how to pitch to investors and financiers to how to further develop certain characters. In the future, we will also include data from Greenlight Essentials.

MK: Writers sometimes wait weeks if not longer for coverage. How fast is Greenlight Coverage?
JZ: 15 minutes to 2 hours when they first upload their screenplay, depending on their subscribed package. The follow-up questions are answered instantly.

MK: In your testing of Greenlight Coverage, how do produced Hollywood scripts rate?
JZ: It’s a mixed bag; the universally critically acclaimed ones usually get a very high score, 8.5 to 9+, like The Godfather, Shawshank, etc. The bad ones like The Room got 3/10. It really depends on the screenplay and the film.

MK: Greenlight Coverage uses a neural network expert system; the coverage section is highly structured whereas the question section is open-ended. How is this done and what LLM does Greenlight Coverage use?
JZ: We are using large language models to power our back end, and it is not one but a few different ones, as well as our proprietary model that was fine-tuned based on industry veteran feedback.

MK: Why should those folks who hate AI give Greenlight Coverage a try for free?
JZ: I totally see where they are coming from, and personally I also agree that in such a creative industry, the human touch is 100% needed. This is just a tool to help give quick feedback and an unbiased opinion on the screenplay. It is useful as another input to the script, but not the be-all and end-all for it.

btw, not to brag but Greenlight Coverage gave my latest script, The Guerrilla Gardeners, 8/10. Wanna produce it?

Netflix releases viewership data for the first time

Jason Hellerman reports on No Film School that Netflix Releases All Its Streaming Data for the First Time Ever.

He points out that this is a huge story because the “notoriously secretive Netflix has published all its streaming numbers for the public to see” for the first time.

Netflix will publish the What We Watched: A Netflix Engagement Report twice a year.

The report has four columns:

  1. Title, both original and licensed
  2. Whether the title was available globally
  3. The premiere date for any Netflix TV series or film
  4. Hours viewed

Some takeaways:

  • This six-month timeframe covers 100 billion hours viewed.
  • Over 60% of the titles appeared on Netflix’s weekly Top 10 lists.
  • 30% of all viewing was for non-English content, mainly Korean and Spanish.

Here’s the Netflix media release.

Here’s their six-month 18,000+ row spreadsheet.

My take: the industry has always wanted more transparency from Netflix, and I don’t think it’s a coincidence that this report comes on the heels of the writer and actor strikes. I would love to see someone take this information and cross-reference it with genres, formats and actors. Will other streamers follow with their data?
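As a toy illustration of that kind of cross-referencing, here is a minimal Python sketch (standard library only) that reads an export of the engagement report and ranks titles by hours viewed. The column names (“Title”, “Available Globally?”, “Release Date”, “Hours Viewed”) are assumptions based on the four columns described above, and the sample figures are stand-ins, not data from the actual spreadsheet.

```python
import csv
import io

# Inline stand-in for the real 18,000+ row export (titles and figures
# are illustrative only; check the actual spreadsheet's schema).
SAMPLE = """Title,Available Globally?,Release Date,Hours Viewed
The Night Agent: Season 1,Yes,2023-03-23,812100000
Ginny & Georgia: Season 2,Yes,2023-01-05,665100000
King the Land: Limited Series,Yes,2023-06-17,630200000
"""

def top_titles(csv_text, n=2):
    """Return the n most-viewed titles as (title, hours) pairs."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    for row in rows:
        row["Hours Viewed"] = int(row["Hours Viewed"])
    rows.sort(key=lambda r: r["Hours Viewed"], reverse=True)
    return [(r["Title"], r["Hours Viewed"]) for r in rows[:n]]

print(top_titles(SAMPLE))
```

From there, joining on title against an external dataset of genres or cast lists would be a straightforward extension of the same approach.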

Now and Then: how the short doc started with audio interviews

Rosie Hilder writes on Creative Bloq all about How Oliver Murray made the 12-minute Now and Then, Last Beatles Song documentary.

Oliver Murray says,

“First of all, the most important thing for me was that it felt fresh and contemporary, so we started out by recording new audio interviews with the surviving members of the band, Sean Ono Lennon and Peter Jackson. It was important to record only audio because that’s my favourite way of getting intimate and conversational interview content.”

He adds,

“I took these interviews into the edit and made a kind of podcast cut of the story, which became our foundation for the timeline…. Interviews are always a big part of my process, and are where I start because more often than not the answers that you get to questions lead you somewhere you didn’t expect and change the course of the project, so I like to do those early. It’s always useful to start with audio because it’s also the most malleable and it’s possible to go back for pick up interviews. Archive footage or access (with a camera) to the people you’re talking to actually doing what they’re talking about is much harder to acquire.”

Rosie asks him, “What is your favourite part of the finished film?”

Oliver replies: The emotional climax of the film is definitely the moment where we get to hear John’s isolated vocal for the first time. It’s quite an emotional moment to hear him emerge from that scratchy demo.

My take: this confirms, for me, that sound is more important than picture. I think it would have been nice to display the dates on each film clip used, because there are a lot of them and they bounce around in time, from now and then.

YouTube’s Dream Track and Music AI Tools

Sarah Fielding of Engadget reports that YouTube’s first AI-generated music tools can clone artist voices and turn hums into melodies.

The YouTube technology is called Dream Track for Shorts.

The mock-up asks you for a text prompt, then writes lyrics and music and has a voice-cloned artist sing them:

YouTube is also testing Music AI Tools:

This is all possible due to Google DeepMind’s Lyria, their most advanced AI music generation model to date.

Cleo Abram also explains the real issue with AI music: When do artists get paid?

My take: AI is just a tool — I use it occasionally and will be exploring it more in 2024. What we as a society really need to figure out is what everyone’s going to do, and how they’ll get (not earn) enough money to live, when AI and Robotics make paid work redundant.

Filmed on iPhone

Apple’s October 30, 2023 Event closed with this end credit: “This event was shot on iPhone and edited on Mac.”

That would be filmed on the iPhone 15 Pro Max with the free Blackmagic Camera app and Apple Log, then edited in Premiere Pro and graded and finalized in DaVinci Resolve.

michael tobin adds further insight and commentary:

My take: Of course, now Apple has to film everything they do like this!


Technology saves “last Beatles song”

The Beatles have just released their last single, Now and Then.

It appears as a free VEVO music video as well as these for-purchase products: various vinyl and cassette editions.

“Now and Then” was a demo John Lennon recorded at The Dakota in the late 1970s. The main reason it’s The Beatles’ last single is that, until now, it was too hard to separate John’s vocals from the piano notes. Technology to the rescue:

Want to know more? Check out this Parlogram documentary.

My take: I like this video most when it starts incorporating images from “Then” with footage from “Now” viz. 1:47, 1:55, etc. I would have liked to see much more of this technique. This is truly the visualization of Now and Then — show us more!

Unfortunately, I’m not overly enamoured with the song itself; I find it middling and melancholic. I also don’t like:

  • The graphics and the cover image — boring!
  • The first few shots of the video are over-sharpened and plop us in the “uncanny valley” — not a good start.
  • The lip sync is poor — if you’re going to use AI, why not go all the way and use AI to reshape the mouths for perfect sync?
  • I think they missed a great opportunity to have Paul and Ringo sing verses in their own voices. Again, why not go all in and use AI to voice clone George and have him sing a verse too?

As to “last singles” — I think they should give this treatment to the last song the Beatles actually recorded together: The End. Although, after more than fifty years, perhaps it’s just time to move on.

Telefilm releases annual report for 2022-23

Telefilm Canada has released two annual reports: Full Screen 2022-2023 Annual Report and PROGRAM RESULTS FROM SELF-IDENTIFICATION DATA COLLECTION Fiscal 2022-2023 – First Edition.

Some highlights:

“We guide filmmakers every step of the way: from training and mentoring at the beginning of their journey, to supporting them in development and production, from helping them promote their projects to partners and digital platforms, to supporting them in theatrical and festival releases.”

  • The foreign production and production services sector accounts for approximately 57% of the total volume of screen-based content produced in Canada.
  • Canadian films’ share of screen time in movie theatres: 4.7%.
  • Telefilm administered a total of $158.7 million.
  • Quebec received 47% of total funding.
  • Ontario received 32% of total funding.

From the 2022-2023 self-identification data report:

  • “Telefilm granted 24% ($20 million) of total funding to projects in which one of the key creative positions was held by a person with a disability.”
  • “For producers with disabilities, the representation is at a quarter (25%) in the Talent to Watch Program.”

My take: there’s lots to celebrate in these reports. I just wish we could see more Canadian films on cinema screens in Canada.

Victoria’s connection to Letterboxd

On September 29, 2023, co-founder Matthew Buchanan of Letterboxd announced, “we have accepted an offer for Tiny to acquire a 60 percent stake in Letterboxd, securing the platform’s future as an independently run company and part of the Tiny stable.”

Matthew elaborates:

“Something else that began around this time, courtesy of Tumblr’s flourishing design community, was my friendship with Andrew Wilkinson and his brother Will, both involved with MetaLab. There was no-one paying attention to the design scene at that time who wasn’t aware of their digital agency in Victoria and the quality of work it was delivering. We first met in person at XOXO in Portland, then kept in touch, as we both continued to build: us with our studio and Letterboxd, and Andrew with MetaLab and then Tiny, which acquires and supports great, creative businesses.”

The Hollywood Reporter reports the deal at $30,000,000 USD.

My take: fascinating! I can’t help but think the world is smaller than I once thought.

Using AI for good

Alyssa Miller reports on No Film School that Deepfake Technology Could Help You Change Inappropriate Dialogue in Post.

Flawless AI’s tools include TrueSync, which can replace dialogue or even change the spoken language, all the while preserving correct mouth and lip movement.

Flawless TrueSync from Flawless AI on Vimeo.

Lara Lewington from BBC‘s Click explores the technology well:

Flawless on BBC Click from Flawless AI on Vimeo.

Other Flawless AI tools include: AI Reshoot (“allows you to make dialogue changes, synchronizing the on-screen actors’ mouths to the new lines, without needing to go back to set”) and DeepEditor (“enables filmmakers to edit and transfer an actor’s performance from one shot to another, even if the shots were filmed from different angles”).

My take: this is powerful technology, but I’m not sure how I feel about Robert De Niro’s face speaking German that some other actor is actually delivering. (Of course, the next iteration of this tech is to voice-clone the original actor and use that clone to speak the German. But now we’re really offside.)

The Revolution Will Be Televised!

Blackmagic has introduced Digital Film for iPhone.

“Blackmagic Camera unlocks the power of your iPhone by adding Blackmagic’s digital film camera controls and image processing! Now you can create the same cinematic ‘look’ as Hollywood feature films. You get the same intuitive and user friendly interface as Blackmagic Design’s award winning cameras. It’s just like using a professional digital film camera! You can adjust settings such as frame rate, shutter angle, white balance and ISO all in a single tap. Or, record directly to Blackmagic Cloud in industry standard 10-bit Apple ProRes files up to 4K! Recording to Blackmagic Cloud Storage lets you collaborate on DaVinci Resolve projects with editors anywhere in the world, all at the same time!”

The tech specs are impressive.

This is a great way to learn Blackmagic’s menu system.

It’s also a great way to get introduced to Blackmagic’s Cloud.

And it’s a great way to explore the Sync Bin in DaVinci Resolve.

Oh, and by the way, it’s free.

My take: in this update, Grant Petty mentions how multiple filmers at “protests” could use Blackmagic Camera on their iPhones and work with an editor to create almost instant news stories; I think this technique could be used at concerts as well.

So, have things changed from fifty years ago? The revolution will be live.