Super 8 is about to make a comeback

The consumer film format called Super 8 was dominant in the sixties, seventies and eighties, until the upstart technology called ‘video’ challenged it in the nineties and vanquished it from the marketplace in the new millennium. HD video now rules. With the right lenses and lighting, we can shoot economical, cinematic images.

Still, are you nostalgic for the real film look? It’s too expensive to actually shoot on film, right? 35mm, even 16mm, is out of reach. But what about Super 8? Is it possible to shoot on Super 8 and transfer to video for post?

My memory of the look of Super 8 is slightly soft, jittery Kodak Kodachrome, with its very warm tone and super-saturated reds. I shot my first films on Super 8, physically splicing the shots together and projecting the original reversal stock, which would jump slightly as the cuts chattered through the projector gate.

One of Super 8’s strengths was also one of its weaknesses. The cartridges were extremely user-friendly but their design meant that the film was held steady during exposure by a simple pressure plate. Jitter, therefore, was built into all Super 8 cameras.

Now, a Danish company called Logmar plans to re-engineer the Super 8 camera. Their idea is to pull the film out of the cartridge and pin register it during exposure. The footage is rock-steady.

What about Super 8 film and developing? In North America, rights, film stock and processing will be handled by Pro8mm of Burbank, California.

My take: at 5 grand, this will be an expensive camera. I love the modern technology Logmar is bringing to a mid-century medium, like the digital monitor and SD sound recording. Neat that they can scale this up to 16mm and 35mm as well. And I love the discipline of film versus video. But film! I thought it was dead! That sample footage does look more like 16mm than the Super 8 I remember. Perhaps if they address the dust on the negative and the dirt in the gate by the frame lines….

‘Sharknado 2’ can control your lights

Whether or not you appreciated the concept behind last year’s ‘Sharknado’ — sharks falling out of the sky — you should appreciate the cool technological tie-in that tonight’s outing brings. (Syfy; Space, 9 ET / 6 PT)

According to Mashable, ‘Sharknado 2: The Second One’ — now set in New York instead of LA — will be able to control your lights. Think flashing during lightning and drenching your room in red during the shark attacks.

That is, if you have Philips hue lights. Combining LED lights, the Internet and smartphone control gives you ‘personal wireless lighting.’

Plus, you need the Syfy Sync app on your smart device.

“The secret sauce of the whole experience is the Syfy Sync app, which typically brings the viewer second-screen information, such as actor profiles and trivia. Similar to Shazam, the app uses audio tagging to identify what the viewer is watching, delivering the right content at the right moment. But the Philips integration takes it to another level.”

My take: kinda cool. With 5.1 sound and responsive lighting, in the proper hands, this could make for very immersive experiences.

A brave new world envisioned by — Disney

The Washington Post reports that Disney is working on a touchscreen that lets you feel textures.

“The technology is called ‘tactile rendering of 3D features,’ and an early version of a rendering algorithm has already been developed by engineers at Disney Research in Pittsburgh. The process behind it is, predictably, both technical and confusing, but the basic premise is that small, electronic pulses can trick your fingers into perceiving bumps and texture, even if the surface is actually flat.”

The right amount of voltage makes you feel ridges, edges, protrusions or bumps!

Check out the video.

My take: This reminds me of the feelies in Aldous Huxley’s Brave New World. Science fact catches up to science fiction!

DIY DCP

If, after reading last week’s post about exhibition formats, you really want a DCP but lack the budget to get one professionally made, you’re in luck.

Danny Lacey has created a tutorial for OpenDCP, open-source software for creating Digital Cinema Packages.

Danny says,

“I’ll tell you what, this is going to open a lot of doors for Indie film makers too, I believe it’s going to be incredibly helpful for those going down the self distribution route. It’s quite simply chopping down the prices and expense of delivering your movie.”

The 27-minute video has all the details but in a nutshell:

  1. Export your film as a 16-bit TIFF sequence.
  2. Use free, open-source DCP software to convert the TIFF sequence into JPEG 2000.
  3. The DCP software then wraps the video (JPEG 2000) and audio (WAV) into MXF files.
  4. The final stage is creating the DCP itself, which generates six files that will be recognised by a DCP server.
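For the command-line inclined, steps 2 through 4 map onto OpenDCP’s CLI tools. A rough sketch, with the caveat that the exact flags vary by OpenDCP version (the directory and file names here are placeholders; check each tool’s --help before relying on it):

```shell
# Step 2: convert the 16-bit TIFF sequence to JPEG 2000 at 24 fps
opendcp_j2k -i tiff_frames/ -o j2k_frames/ -r 24

# Step 3: wrap the JPEG 2000 picture and the WAV audio into MXF files
opendcp_mxf -i j2k_frames/ -o video.mxf -r 24
opendcp_mxf -i audio.wav -o audio.mxf

# Step 4: generate the DCP's XML files (CPL, PKL, ASSETMAP and friends)
opendcp_xml --reel video.mxf audio.mxf
```

Danny’s video walks through the same pipeline in the GUI, so the tutorial works either way.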

My take: well worth the watch!

Google Glass: everyone is a cinematographer now

According to VentureBeat, the first arrest has been captured on Google Glass.

Documentary filmmaker Chris Barrett glassed it in Atlantic City on July 4.

“I picked up my Google Glass explorer edition last week. I wanted to test Glass out, so I filmed some fireworks, getting a very cool first-person perspective. About 10 minutes after the fireworks, we were walking back to our car, and I just decided to try it out on the boardwalk.”

Watch the footage.

My take: welcome to the voyeur world, where everyone is a cinematographer. Right now, the public is unaware and continues to “act natural”. But will behaviour change? Has it changed with CCTV and cellphone video? Very soon, documentaries will look very different when everyone has their own Glass. Editing may be replaced with web-based crowd-sourced Glass-fueled media streams, like Switchcam.

Hyperaudio Pad: cut and paste transcript video editing

One of the problems with digital film is that it’s too damn easy to just shoot and shoot and shoot.

Great! But then, when it’s time to edit, you’re faced with hours and hours and hours of footage.

Just logging it can take days.

A new technology promises to revolutionize this and more: Hyperaudio Pad.

Twenty-two final round winners in the Mozilla Ignite Challenge were just announced. One of them is Hyperaudio Pad by Mark Boas et al.

The creators think they’ve built a language learning tool. I think it’s a whole new way to edit video.

What’s so cool about it?

  • It automatically transcribes your footage into text.
  • You can cut and paste the words to create visual sequences.

See the web page and the pitch video.

Most importantly, see the application they made for Al Jazeera English. Search for ‘nuclear’ and then click on each of the 35 tiny red and blue squares.

My take: imagine letting Hyperaudio Pad loose on your footage. How great would your next documentary be?

Note: Avid has something similar called ScriptSync.

Get ready for 8K!

On the heels of the massive 4K consumer television released by Sony, the Japanese have done it again.

NHK, Japan’s public broadcaster, will screen the first 8K film in Cannes later this month.

“8K Super Hi-Vision was developed by NHK and delivers a resolution of 7,680 by 4,320 pixels – four times the resolution of 4K and 16 times that of HD – as well as featuring 22.2 multichannel sound.”
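Those multiples check out with a quick pixel count:

```python
# Pixel counts for the resolutions quoted above.
shv_8k = 7680 * 4320   # 8K Super Hi-Vision
uhd_4k = 3840 * 2160   # 4K UHD
full_hd = 1920 * 1080  # 1080p HD

print(shv_8k)            # 33177600 pixels -- over 33 megapixels per frame
print(shv_8k // uhd_4k)  # 4: four times the pixels of 4K
print(shv_8k // full_hd) # 16: sixteen times the pixels of HD
```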

Film theatres are just finishing their transition to digital projectors capable of 4K.

My take: I think technology is about to surpass biology; going to see a film in a movie theatre might one day be a sharper and clearer experience than anything you could see in real life. We may never go outside again!

The Future of Digital Cameras, According to Filmmaker Magazine

David Leitner of Filmmaker Magazine has published his take on the future of digital cameras.

“A motion picture camera used to be a light-sealed box with a strip of film running through it…. Today’s cameras are exponentially more complex. They are literal bundles of separate technologies, each lurching forward at a different rate. To understand today’s cameras, you must understand the parts to understand the whole.”

He looks at:

  • Pixel Count (4K is here)
  • Sensors (Super 35 with incredible ISO ratings)
  • Lenses (18-200mm with zoom servos)
  • Media (Solid state)
  • Frame Rates (120 to 240 to 480 to 960 for slow motion)
  • Compression and Latitude (H.265 and RAW)
  • Camera Design and Control (bonded cellular, anyone?)
  • Workflow and Post (64-bit and the free DaVinci Resolve Lite)
  • New Cameras in 2013 (including the For-A FT-ONE with 4K capture at 900 fps)

Well worth the long read.

My take: film is dead!

4K consumer TVs are here; bring cash

Well, when I say ‘consumer’ I mean your very, very rich uncle. And you might need a wheelbarrow for all that cash.

Sony has unveiled a $25K television that has four times the resolution of your measly Full HD 1080 flat-screen TV.

The XBR-84X900 4K Ultra HD TV has a native resolution of 3,840 x 2,160 pixels packed into an 84-inch screen. There is virtually no visible pixel structure! Sony says:

“Sony is a leader in 4K digital cinematography and projection. As of June 2012, there were over 13,000 Sony 4K digital cinema theaters — nearly 75% of all 4K theaters worldwide use Sony 4K digital cinema projectors. And now Sony brings the full 4K digital experience into the home with stunning picture quality, whether you’re watching native 4K video or low-resolution smartphone clips. The newly-developed XCA8-4K chip upscales HD (or lower resolution) images by analyzing and refining images from all sources. Everything you see is restored with beautiful, natural detail, richer color and stunning contrast. The latest Reality Creation database and Super Resolution processing breathes new life into everything you see with phenomenal 4K (3,840 x 2,160) resolution.”

See Sony’s information or visit Atlas Audio Visual in Victoria to see the only model on display in Canada.

Lytro light field camera raises possibilities

At first glance the Lytro light field camera seems underwhelming.

It’s a tiny spyglass camera that takes square 1080 x 1080 photos. And costs $400 to $500. There’s limited control and no video….

But the technology behind it is kinda cool. It captures all the rays of light bouncing off everything in front of the lens, which means you can focus later, and keep changing the focus long after the moment is frozen in time. Lytro calls these ‘living pictures’.

Here’s how I envision using this camera:

  • Interactive ‘stories’ composed of a series of living photos that tell a narrative. Each image would be carefully designed with two areas of interest and the viewer would ‘pull focus’ from the first to the second.

But four more ideas come to mind, when I push this tech forward:

  1. Imagine when time and mono sound can be added to this mix. The viewer will be able to refocus different areas in the shot as it unfolds over time. This would be a ‘living movie’.
  2. What if enough data could be recorded to allow the viewer to change their point of view within the shot? This would be the ‘living picturescape’.
  3. Imagine a similar device to capture the ‘sound field’. A sound field recorder would work very similarly to the Lytro. It would capture all the sound waves bouncing off of objects in earshot. The user would then be able to navigate through the soundscape, in essence moving the microphone closer to the sound source they want to hear.
  4. Now imagine a combination of both devices: a living movie with a soundscape microphone — what I’m calling the AVscape. Now that’s getting close to true virtual reality!

Play with Lytro images. Click on an out-of-focus area. Neat!