360 is not VR, but still cool

VR. 360. The next big thing? The same thing?

No, they’re quite different. Blame Apple’s QuickTime VR for mixing up the two. (Released in 1994, QTVR was essentially a single frame from any of today’s 360 videos, not truly a virtual reality environment.)

Raindance’s Baptiste Charles-Aubert makes a pointed distinction: ‘360 is immersive as opposed to VR which is interactive’:

“In a Virtual Reality setting, the viewer/player leads. With 360, as filmmakers, we need to keep control over the narrative and push it forward. That’s what happens with 360. The viewer is experiencing a story happening around him and not to him. The physical location then matters even more in 360.”

One neat example of 360 documentary filmmaking is a series of Paul McCartney ‘interviews’ by Jaunt.

Their 360 technology is impressive, with no discernible ‘stitching’.

My take: If the selfie is the painted self-portrait digitized, what is the equivalent documentary or narrative film? Will it utilize VR or 360? Or are those technologies reserved for something else? The modern, introspective equivalent of album liner notes, perhaps?

A film festival for every film

With upwards of 3,000 active film festivals around the world, there’s a film festival for every type of film.

And now, there’s even one dedicated to drone filming: FRiFF.

Filmmakers looking for validation from juries can search for suitable festivals on Withoutabox or FilmFreeway.

Just be sure festival exposure is a component of your overall distribution and marketing strategy.

My take: that’s it — I’m going to start my own film festival. Coming to cyberspace soon! Even if Robert Redford once said there are too many film festivals…

Lytro reveals revolutionary studio camera

Although you’ll never be able to afford one, Lytro introduced its Lytro Cinema Camera at NAB on April 19, 2016.

This is a huge studio camera with a foot-and-a-half-wide lens tethered to its own server farm. It captures “755 RAW Megapixels” at 300 fps with up to 16 stops of dynamic range.

That’s about 15 times more resolution than a full-frame DSLR at 50MP.

It doesn’t actually record images though. It captures the “light field” — the lightscape of reflected light rays in front of the lens. Behind the front lens, an array of microlenses allows Lytro to “capture a light field, compute the ray angles and then replicate that light field in a virtual space.”

In other words, this camera captures a virtual hologram of the scene in front of it.

With this computational model, Lytro can, after capture (i.e. “in post”):

  • refocus and change depth of field
  • adjust frame rate and shutter angle
  • pull a key based on depth and not green screen
  • stabilize camera movement based on actual movement in space
  • natively create 3D footage from one shot
  • as a DI, output optimal deliverables for any format
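
To give a rough sense of how the first item on that list becomes possible, here is a minimal sketch of ‘shift-and-sum’ light-field refocusing in Python. It illustrates the general technique only, not Lytro’s actual pipeline, and everything in it (the sub-aperture images, their aperture offsets, the alpha parameter) is an assumption made for the example.

```python
import numpy as np

def refocus(subaperture_images, offsets, alpha):
    """Shift-and-sum refocusing of a light field.

    subaperture_images: list of HxW (or HxWx3) arrays, one view per
        sample point across the lens aperture.
    offsets: list of (u, v) aperture coordinates for each view, centred on (0, 0).
    alpha: focus parameter; 0 keeps the original focal plane, larger
        magnitudes move the virtual focal plane nearer or farther.
    """
    acc = np.zeros_like(subaperture_images[0], dtype=np.float64)
    for img, (u, v) in zip(subaperture_images, offsets):
        # Shift each view in proportion to its position in the aperture;
        # np.roll is a crude stand-in for proper sub-pixel interpolation.
        shift = (int(round(alpha * v)), int(round(alpha * u)))
        acc += np.roll(img, shift, axis=(0, 1))
    return acc / len(subaperture_images)
```

Change alpha after the fact and the virtual focal plane moves, which is the gist of ‘refocusing in post’ once the full ray data has been captured.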

Watch the No Film School interview and video.

My take: With its Cinema Camera, Lytro has displaced image capture with lightscape hologram capture. If I were a Hollywood producer, I’d use this camera on 3D shoots and to simplify keys for composite work. And — to fix those pesky out-of-focus shots. But wait! There’s more! They’re also promising a Light Field VR Camera called Lytro Immerge.

Kodak announces a new Super 8 camera

At CES 2016 in Las Vegas last week, Kodak stunned the world by announcing it is making a new Super 8 camera for release this fall.

After emerging from bankruptcy two years ago, Kodak decided to go all in on film, even though film represents only 10% of its business.

Hollywood filmmakers, many of whom grew up shooting Super 8, convinced Kodak to bring back the narrow-gauge format.

Kodak believes Super 8 can join the Maker Movement and ride the analogue trend.

Check out the camera specs.

My take: I’m also one of the filmmakers who got their start shooting Super 8. I have two concerns with Kodak’s new camera. While the viewfinder and SD card are appreciated, what were they thinking with the microphone? Super 8 cameras are typically noisy! My other concern is with the jitter inherent in Super 8. Logmar of Denmark has solved this — but their camera costs ten times as much. What I do think is brilliant is Kodak getting back into the film processing business and combining it with film scanning. That combination is the real news here and could make more people consider shooting on Super 8. But only if your pockets are very deep or your shot list is (super) short.

The Most Technologically Advanced Book Ever Published

Chuck Salter writes in FastCoDesign about a publishing company that continues to innovate in the personalized book field.

First came ‘The Girl Who Lost Her Name’ and ‘The Boy Who Lost His Name’. Now comes ‘The Incredible Intergalactic Journey Home’.

“This time, a lost boy or girl navigates his or her way from outer space back home. Spoiler alert: to the reader’s actual home. The wayward space ship swoops into his or her city and arrives in the child’s neighborhood. The image, the book’s big reveal, incorporates the corresponding satellite photos. That degree of personalization required even more algorithms and developers than Lost My Name’s first book, along with help from NASA, Microsoft, satellite makers, and other unlikely children’s book partners.”

The creators are Lost My Name of East London. What a wondrous book and a steal at $30.

My take: I love this concept and the marvellous execution! (The new book does remind me slightly of Arcade Fire’s Chrome Experiment, The Wilderness Downtown, which may or may not still be working.) Now imagine this in the video realm. I see no reason, with the state of CGI, digital production and online streaming, that my likeness could not be inserted into productions and animated, for my entertainment only. Maybe not in real time initially, and probably not with voice. But imagine your own channel on Netflix, starring or co-starring you! That might be fun.

To 4K or not to 4K

Near the beginning of your indie film project, you need to determine the video format you will be shooting.

Will it be 4K, or 2K, or — gasp — even 1080p?

This chart shows those display resolutions and many more.

More and more cameras shoot 4K, so why even debate it?

There are some very good reasons not to shoot in 4K. Mentorless has three:

“#1 – Nobody Can Tell the Difference
#2 – It Will Stretch Your Budget And Take More Time
#3 – The Delivery Side Can’t Hold Its Part of the Contract (Yet)”

On the other hand, the main reasons touted for shooting in 4K are to allow digital cropping, stabilization and zooming in post-production, and to ‘future-proof’ your production against the day when everyone has a 4K TV and no cap on their Internet data.
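
As a back-of-the-envelope illustration of the cropping argument (my own numbers, not from the Mentorless piece), here is the headroom a UHD capture leaves when the delivery is 1080p:

```python
# How much punch-in room a UHD (3840x2160) capture leaves
# when the final delivery is 1080p (1920x1080).
capture_w, capture_h = 3840, 2160
deliver_w, deliver_h = 1920, 1080

max_punch_in = min(capture_w / deliver_w, capture_h / deliver_h)
print(f"Maximum crop or zoom without upscaling: {max_punch_in:.1f}x")  # 2.0x

# The same capture also leaves stabilization headroom: a 10% crop for
# smoothing still yields 3456x1944, comfortably above 1080p.
stab_w, stab_h = int(capture_w * 0.9), int(capture_h * 0.9)
print(stab_w, stab_h, stab_w >= deliver_w and stab_h >= deliver_h)
```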

My take: I like to shoot fast, so I’m partial to smaller form factors; 1080p is fine for me, for now. But the new DJI Osmo really caught my eye. The 4K footage is amazing and incredibly smooth for such a small camera.

Strides in VR filmmaking

Discovery has launched a new project: Discovery VR.

Although there are only 10 VR videos on the site right now, you can pan each of them through a full 360 degrees with your mouse.

There’s diving with sharks, skateboarding in San Francisco and a surfing lesson.

I found that changing the control setting to Mouse Grab from the default Mouse Movement gave me more natural movement.

In addition to the website, there’s an app for iPhone and Android devices. Pair it with Google Cardboard or Samsung Gear VR for a headset experience.

The company behind the magic is Littlstar.

My take: I remember the initial release of QuickTime VR in 1994, which gave me my first glimpses of ‘virtual reality’. GameSpot has an interesting history of VR. I think the application to narrative film will be fascinating. For instance, see Intimate Strangers: Chapter 1 — camera placement and mise en scène become very important. I like the way the ‘dream’ is projected onto the ceiling above the woman. A tip for VR directors: place the camera just to one side of the ‘line’ and let the viewer pan from one actor to the other and back.

U2 shows us the way with live mobile streaming

First, YouTube enabled anyone to post moving images to the Internet, democratizing the movies.

Now mobile streaming apps are revolutionizing live broadcasting, once the domain of television.

Both launched within the last three months, Meerkat and Periscope enable anyone with a smartphone to broadcast live video to the world in real time.

Meerkat (iOS and Android) wants you to first log in to Twitter. The left column lists upcoming streams, comments are on the right and the stream is featured vertically in the middle. Meerkat loves the colour yellow.

Periscope (iOS and Android) was purchased by Twitter shortly after Meerkat debuted. Comments are superimposed in the bottom left-hand corner, and you can show some ‘love’ with hearts that float up the right side of the vertical screen.

You can search Twitter to find live Meerkat streams or live Periscope streams.

New York digital and social agency GLOW also offers two ways to sample multiple streams.

Rock band U2 have embraced Meerkat. During the current i+e Tour, according to The Hollywood Reporter,

“The band invites an audience member onto the B stage to shoot a stripped-down number — on this night, ‘Angel of Harlem’ — to be broadcast live via the fledgling Meerkat platform. ‘This goes out across the globe — to about 150 people, until it catches on,’ Bono quipped.”

My take: I think this is truly revolutionary. The ‘airwaves’ for traditional TV broadcasters are strictly controlled by the FCC in America and the CRTC in Canada. Now, everyone with a smartphone has a ‘TV’ camera in their pocket and can begin broadcasting to the world at any time, for free! Journalism and entertainment may never be the same again. Interestingly, both apps use a mobile-friendly vertical orientation, which is decidedly uncinematic.

Disney posits computer-aided editing

Disney researchers are working on an algorithm that automatically edits footage from multiple cameras into a coherent narrative.

It maps the common attention point in space across all the cameras as a proxy for the common subject. It then applies editing rules such as the 180-degree rule, jump-cut avoidance and cutting on action — things your editor does now.
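
As a thought experiment (this is not Disney’s published algorithm), the rule-applying step might be sketched as follows, with the attention-point bookkeeping assumed as inputs: each camera’s angle to the subject, its side of the action line and whether it currently frames the subject.

```python
MIN_SHOT_SEC = 2.0    # rule of thumb: avoid machine-gun cutting
JUMP_CUT_DEG = 30.0   # below this change of angle, a cut reads as a jump cut

def pick_next_camera(cameras, current, shot_seconds):
    """Choose the next camera under a few classical editing rules.

    cameras: dict of name -> {"axis_deg": angle to the attention point,
                              "side": "left" or "right" of the action line,
                              "frames_subject": bool}
    All of these fields are assumed inputs; a real system would derive them
    from the tracked attention point the researchers describe.
    """
    if shot_seconds < MIN_SHOT_SEC:                   # minimum shot length
        return current

    candidates = []
    for name, cam in cameras.items():
        if name == current or not cam["frames_subject"]:
            continue
        if cam["side"] != cameras[current]["side"]:   # 180-degree rule
            continue
        delta = abs(cam["axis_deg"] - cameras[current]["axis_deg"]) % 360
        delta = min(delta, 360 - delta)
        if delta < JUMP_CUT_DEG:                      # jump-cut avoidance
            continue
        candidates.append((delta, name))

    # Prefer the biggest legal change of angle, otherwise hold the shot.
    return max(candidates)[1] if candidates else current
```

Cutting on action would then come down to timing the cut to a motion peak in the attention-point track, which is the part this sketch leaves out.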

Their video is convincing.

See their take on interactive synchronization as well.

My take: this would be fascinating to see applied to news or documentary footage. It might also be applied to down-and-dirty multi-camera narrative work. The job of tomorrow’s editor might evolve into finessing these cuts, choosing appropriate cutaways and organizing the order of scenes.

Hyperlapse solves shaky time-lapse footage

Fascinating news from Microsoft Research: they can fix your shaky GoPro time-lapse footage.

The technique is called Hyperlapse and they say an application is coming soon, perhaps in a few months.

Better yet, they show you how it’s done.
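
For the curious, a crude approximation of the idea can be hacked together with OpenCV: decimate the frames to get the time-lapse speed-up, then smooth the camera path between the frames you keep. This is nothing like Microsoft’s full pipeline, and the file names and parameters below are placeholders.

```python
import cv2
import numpy as np

SPEEDUP = 8          # keep every 8th frame (the time-lapse part)
SMOOTH_RADIUS = 15   # moving-average window for the camera path

# Read and decimate the input (the file name is just an example).
cap = cv2.VideoCapture("gopro.mp4")
frames, i = [], 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if i % SPEEDUP == 0:
        frames.append(frame)
    i += 1
cap.release()

# Estimate frame-to-frame motion between the kept frames.
transforms = []
prev_gray = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
for frame in frames[1:]:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300,
                                  qualityLevel=0.01, minDistance=20)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good = status.flatten() == 1
    m, _ = cv2.estimateAffinePartial2D(pts[good], nxt[good])
    transforms.append((m[0, 2], m[1, 2], np.arctan2(m[1, 0], m[0, 0])))
    prev_gray = gray

# Smooth the accumulated camera path with a simple moving average.
trajectory = np.cumsum(transforms, axis=0)
kernel = np.ones(2 * SMOOTH_RADIUS + 1) / (2 * SMOOTH_RADIUS + 1)
smoothed = np.column_stack([np.convolve(trajectory[:, k], kernel, mode="same")
                            for k in range(3)])
corrections = smoothed - trajectory

# Re-warp each frame toward the smoothed path and write the result.
h, w = frames[0].shape[:2]
out = cv2.VideoWriter("hyperlapse.avi", cv2.VideoWriter_fourcc(*"MJPG"), 30, (w, h))
for frame, (dx, dy, da), (cx, cy, ca) in zip(frames[1:], transforms, corrections):
    a = da + ca
    warp = np.array([[np.cos(a), -np.sin(a), dx + cx],
                     [np.sin(a),  np.cos(a), dy + cy]], dtype=np.float32)
    out.write(cv2.warpAffine(frame, warp, (w, h)))
out.release()
```

As I understand it, Microsoft’s version goes much further, reconstructing the scene in 3D and rendering a new, physically smooth camera path through it, which is why their results hold up where a simple 2D warp would not.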

My take: thanks for sharing, Microsoft. I love that you’ve released the technical know-how as well. I predict a Google Street View-style multiple camera rig capturing overlapping footage to generate a rich ‘picture-scape’ combined with this software to create immersive, real-time, viewer-defined camera movement. Movies, meet video games.