Super Fast Screenplay Coverage

Jason Hellerman writes on No Film School that I Got My Black List Script Rated By AI … And This Is What It Scored.

Jason says, “An AI-driven program called Greenlight Coverage gives instant feedback on your script. You just upload it, and the AI software spits coverage back to you. It rates different parts of the script on a scale from 1-10 and then gives a synopsis, positive comments, and notes on what would make it better. The program even creates a cast list and movie comps, allowing you to have an AI question-and-answer session to ask specific questions about the script.”

His script Himbo, which was on The Black List in 2022 and rated incredibly high by ScriptShadow, scored 6/10 on Greenlight Coverage.

He concludes:

“The truth is, I could see a read like this coming from a human being. Is it the best coverage? No. But as someone who has tested many services out there, I felt it gave better coverage than some paid sites, which are hit-and-miss depending on the person who reads your script. I look at AI as a tool that some writers may decide to use. I was happy I tried this tool, and I honestly was surprised by the feedback of the coverage.”

My take: I also participated in the beta test of Greenlight Coverage and asked the creator Jack Zhang the following questions via email.

Michael Korican: For folks used to buying coverage for their scripts, what are the main features of Greenlight Coverage that set it apart?
Jack Zhang: The speed, accuracy, consistency as well as reliability. Also the ability to ask follow up questions that can provide guidance on how to pitch to investors and financiers, all the way to how to further develop certain characters. In the future, we will also include data from Greenlight Essentials.

MK: Writers sometimes wait weeks if not longer for coverage. How fast is Greenlight Coverage?
JZ: 15 mins to 2 hours when they first upload their screenplay, depending on their subscribed package. The follow up questions are answered instantly.

MK: In your testing of Greenlight Coverage, how have produced Hollywood scripts scored?
JZ: It’s a mixed bag; the universally critically acclaimed ones usually get a very high score, 8.5 to 9+, like The Godfather, Shawshank, etc. The bad ones, like The Room, got 3/10. It really depends on the screenplay and the film.

MK: Greenlight Coverage uses a neural network expert system; the coverage section is highly structured whereas the question section is open-ended. How is this done and what LLM does Greenlight Coverage use?
JZ: We are using large language models to power our back end and it is not one, but a few different ones as well as our proprietary model that was fine tuned based on industry veteran feedback.

MK: Why should those folks who hate AI give Greenlight Coverage a try for free?
JZ: I totally see where they are coming from and personally I also agree that in such a creative industry, the human touch is 100% needed. This is just a tool to help give quick feedback and an unbiased opinion on the screenplay. It is useful as another input to the script, but not the end all and be all for it.

btw, not to brag but Greenlight Coverage gave my latest script, The Guerrilla Gardeners, 8/10. Wanna produce it?

Kodak releases ‘new’ Super 8 camera — the price will surprise you!

Brian Hallett, writing on ProVideo Coalition, was one of the first to report on The Brand New KODAK Super 8mm Film Camera.

Yes, this camera films on those boxy Super 8 cartridges, but it does so much more:

  • It has a 4″ LCD viewfinder with aspect ratio overlays, interactive menus and camera settings.
  • It has an extended gate, now 13.5:9 full frame (versus the traditional 4:3, or 12:9, aspect ratio), which will make cropping down to 16:9 a breeze (see the quick crop math after this list).
  • It comes equipped with a detachable wide-angle 6 mm 1:1.2 C-mount lens but you can screw on any C-mount lens.
  • It records sound via an on-board sound recorder and a 3.5 mm input, saving audio to an integrated SD card.
  • It runs crystal sync at 24 and 25 fps, plus over and under cranks at 18 and 36 fps.
  • It has a distinctive top handle and a pistol grip.
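To put that aspect-ratio bullet in perspective, here’s a quick back-of-the-envelope sketch (my own arithmetic, not from Kodak’s datasheet) of how much of each frame survives a 16:9 crop:

```python
# Rough crop math: fraction of frame height kept when trimming a source
# frame down to a wider 16:9 target (width is kept, height is cropped).
def height_retained(src_w: float, src_h: float, target_w: float = 16, target_h: float = 9) -> float:
    return (src_w / src_h) / (target_w / target_h)

print(f"13.5:9 gate -> 16:9 keeps {height_retained(13.5, 9):.0%} of the height")
print(f"   4:3 gate -> 16:9 keeps {height_retained(4, 3):.0%} of the height")
# The extended 13.5:9 gate keeps about 84% of the frame height after the crop,
# versus only 75% for a traditional 4:3 Super 8 gate.
```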

Here’s the full datasheet. And here’s their sizzle reel:

Curiously, Kodak first announced this camera in January 2016 at CES for “between $400 and $750.”

Funny, both the released camera and the prototype look an awful lot like the Logmar Humboldt S8.

Logmar’s next Super 8 camera, the Chatham S8, had a superior Latham loop mechanism that resulted in rock-steady registration, never seen before on Super 8. Check out this sample that actually looks like 16 mm footage:

Logmar’s latest C-mount Super 8 camera is the Gentoo GS8 that “uses standard Kodak 50 ft cartridges in combination with a re-usable spacer providing true pin registration.”

Oh, and by the way, the MSRP of Kodak’s “new” Super 8 camera is only $5,495 USD. That does include one cartridge of Super 8 film. No word on whether that includes processing, though.

My take: [shakes head] Kodak, Kodak, Kodak. You’re just trying to get this project off your books, right?

Finally, new plots unlocked for Hallmark movies!

Ryan Morrison of Tom’s Guide reveals I asked ChatGPT to create a Hallmark Christmas movie — and it went better than expected.

He begins:

“Part of my job is testing AI products to find out how well they work, what they can be used for and just how good they are at different tasks. So, inspired by my mom’s favorite genre of movie I decided to ask ChatGPT to write a Christmas story in the style of Hallmark.”

His ChatGPT 4 prompt? “Can you help me come up with the plot for a Hallmark-style Christmas movie?”

The resulting basic plot? “In ‘Christmas Carousel’, a New York architect discovers love and the value of tradition when she teams up with a local carousel restorer to save a cherished holiday attraction in a small town.”

There’s a more detailed plot, characters and even dialogue.

The AI even suggests filming in Cold Spring, New York State.

Oops! There is a real 2020 Hallmark movie called “A Christmas Carousel” with this plot: “When Lila is hired by the Royal Family of Ancadia to repair a carousel, she must work with the Prince to complete it by Christmas.”

My take: even though it appears ChatGPT 4 came extremely close to ripping off the title of an existing Hallmark movie, I like its plot better than the real one. Go figure.

YouTube’s Dream Track and Music AI Tools

Sarah Fielding of Engadget reports that YouTube’s first AI-generated music tools can clone artist voices and turn hums into melodies.

The YouTube technology is called Dream Track for Shorts.

The mock-up asks you for a text prompt, then writes lyrics and music and has a voice-cloned artist sing them:

YouTube is also testing Music AI Tools:

This is all possible due to Google DeepMind’s Lyria, their most advanced AI music generation model to date.

Cleo Abram also explains the real issue with AI music: When do artists get paid?

My take: AI is just a tool — I use it occasionally and will be exploring it more in 2024. What we as a society really need to figure out is what everyone’s going to do, and how they’ll get (not earn) enough money to live, when AI and Robotics make paid work redundant.

Filmed on iPhone

Apple’s October 30, 2023 Event closed with this end credit: “This event was shot on iPhone and edited on Mac.”

That would be filmed using the iPhone 15 Pro Max with the free Blackmagic Camera App and Apple Log, then edited in Premiere Pro and graded and finalized in DaVinci Resolve.

michael tobin adds further insight and commentary:

My take: Of course, now Apple has to film everything they do like this!

 

Using AI for good

Alyssa Miller reports on No Film School that Deepfake Technology Could Help You Change Inappropriate Dialogue in Post.

Flawless AI’s tools include TrueSync, which can be used to replace dialogue or even change the spoken language, all while preserving correct mouth and lip movement.

Flawless TrueSync from Flawless AI on Vimeo.

Lara Lewington from BBC’s Click explores the technology well:

Flawless on BBC Click from Flawless AI on Vimeo.

Other Flawless AI tools include AI Reshoot (“allows you to make dialogue changes, synchronizing the on-screen actors’ mouths to the new lines, without needing to go back to set”) and DeepEditor (“enables filmmakers to edit and transfer an actor’s performance from one shot to another, even if the shots were filmed from different angles”).

My take: this is powerful technology, but I’m not sure how I feel about Robert De Niro’s face speaking German that some other actor is actually speaking. (Of course, the next iteration of this tech is to clone his voice and use that to speak the German. But now we’re really offside.)

The Revolution Will Be Televised!

Blackmagic has introduced Digital Film for iPhone.

“Blackmagic Camera unlocks the power of your iPhone by adding Blackmagic’s digital film camera controls and image processing! Now you can create the same cinematic ‘look’ as Hollywood feature films. You get the same intuitive and user friendly interface as Blackmagic Design’s award winning cameras. It’s just like using a professional digital film camera! You can adjust settings such as frame rate, shutter angle, white balance and ISO all in a single tap. Or, record directly to Blackmagic Cloud in industry standard 10-bit Apple ProRes files up to 4K! Recording to Blackmagic Cloud Storage lets you collaborate on DaVinci Resolve projects with editors anywhere in the world, all at the same time!”

The tech specs are impressive.

This is a great way to learn Blackmagic’s menu system.

It’s also a great way to get introduced to Blackmagic’s Cloud.

And it’s a great way to explore the Sync Bin in DaVinci Resolve.

Oh, and by the way, it’s free.

My take: Grant Petty mentions in this update how multiple filmers at “protests” could use the Blackmagic Camera app on their iPhones and work with an editor to create almost instant news stories; I think this technique could be used during concerts as well.

So, have things changed from fifty years ago? The revolution will be live.

What is the difference between USB-C and Thunderbolt?

Now that Apple has brought a USB-C port to the iPhone 15, it’s time to review how USB-C differs from Thunderbolt.

Similarities:

  • They look alike.
  • They are compatible with each other.

How to identify similar cables:

  • A Thunderbolt connector will have a lightning bolt symbol on it.
  • No lightning bolt? Then it’s a USB-C connector.

Keep in mind:

  • Use Thunderbolt cables between Thunderbolt devices to get the fastest transfer speeds.
  • Thunderbolt 4 runs at 40Gbps, up to four times faster than a typical 10Gbps USB-C (USB 3.2 Gen 2) connection.
  • USB-C cables marked SS (for SuperSpeed) are faster than USB 2.0 cables.

(See Cult of Mac for more.)
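To put those rated speeds in perspective, here’s a rough sketch (my own arithmetic, and it optimistically assumes the link sustains its full rated speed, which real cables and drives rarely do) of how long a 100 GB transfer would take:

```python
# Idealized transfer times for a 100 GB file at each link's rated speed.
def transfer_seconds(file_gb: float, link_gbps: float) -> float:
    return (file_gb * 8) / link_gbps  # 8 bits per byte

for name, gbps in [("USB 2.0", 0.48), ("USB-C (USB 3.2 Gen 2)", 10), ("Thunderbolt 4", 40)]:
    print(f"{name:>22}: 100 GB in ~{transfer_seconds(100, gbps):.0f} s")
# USB 2.0 (480 Mbps):      ~1667 s -- nearly half an hour
# 10 Gbps USB-C:           ~80 s
# Thunderbolt 4 (40 Gbps): ~20 s
```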

My take: turns out, a cable is not just a cable! I wonder if there’s a plug-in device that tells you whether your cable is legit and what speeds it supports. But whatever you do, don’t check out this video if you’re paranoid.

Sony tests cameraless virtual production

Jourdan Aldredge notes on No Film School that Sony is Testing Real-Time Cameraless Production for the New Ghostbusters Movie.

He writes:

“Sony Pictures Technologies has unveiled its latest developments in real-time game engine technology with this new proof-of-concept project…. Its “cameraless” virtual production style… intends to allow developers to use this real-time game engine to produce a scene live on a motion capture set.”

Jason Reitman, who wrote and directed the two-minute scene in one day, says:

“I love filmmaking in real places with real actors. So for me, this is not a substitute. If I were to make Juno again today, I would make Juno exactly the same way. What I see here, what thrills me, is if I wanted to make a movie like Juno that took place in ancient Rome, I wouldn’t be able to do that because it would cost $200 million to make Juno. And that’s not cost effective. There’s no studio who would sign up for that. You can make Ghostbusters for a lot of money, but you can’t make an independent film in an unexpected location. You can’t make an independent film underwater, on the moon or, you know, a thousand years ago or into the future, and what thrills me about it is the possibility of independent filmmakers who want to tell their kind of stories, but in environments that they don’t have access to with characters that they don’t have access to, and the possibility of getting a whole new wave of stories that you could read in a book, but you would never actually get in a film.”

My take: While I agree with Jason Reitman that this technology is promising, I think their finished scene is underwhelming. It’s just not believable. For instance, the folks on the sidewalks are obviously from a video game. The traffic is not real world either. And the actor is not human; he’s a marshmallow! However, this might be where superhero comic book movies are going: totally computer-generated, with the faces of the stars composited onto the quasi-lifelike animation. (My nightmare situation: those faces and voices are AI generated from scans and recordings!)

AI delivers on “Fix it in post!”

Michael Wynne, an audio mastering engineer, just claimed on his In The Mix channel: I Found The Best FREE AI Noise Reduction Plugin in 2023.

The tool is a free AI de-noising and de-reverb plugin called GOYO by South Korea’s Supertone AI.

Michael begins:

“I’ve used many of the sort of more premium and expensive dialogue, restoration and denoising softwares, and those are very good, but I haven’t come across a free tool that even comes close to what is offered by those. So I was really curious, downloaded it, tried it in my DAW and video editor, and was just completely shocked by the results.”

There are three dials: Ambience, Voice and Voice Reverb. You can solo, mute, decrease or increase each band. Simple and powerful!

His expert tip:

“My favourite way to use these sorts of tools is to dial them in and then print the results. So I would control this to the amount I’d like. I would export that as a new WAV file, take a listen and then work with that so that I know that the next time I open up the session, it’s going to be exactly the same.”

The How to Use page from Supertone is quite sparse, so Michael’s examples are great.

My take: the perennial joke on set has always been, “We’ll fix it in post.” Well, now that’s possible for sound! I’ve used this on a short and can attest to its ease of use and incredible results. I concur with Michael that it’s best to print out each voice track as a WAV file and re-synchronize it to the timeline, because I found that the effect either did not persist between sessions, had its values reset to zero, or was present but displayed its numbers as zero. My other tip is to use only the graphical user interface (not the Inspector), as this seemed to work best. After all, this is a free beta!