U2 shows us the way with live mobile streaming

First, YouTube enabled anyone to post moving images to the Internet, democratizing the movies.

Now mobile streaming apps are revolutionizing live broadcasting, once the domain of television.

Both launched within the last three months, Meerkat and Periscope let anyone with a smartphone broadcast live video to the world in real time.

Meerkat (iOS and Android) asks you to log in with Twitter first. The left column lists upcoming streams, comments appear on the right, and the stream is featured vertically in the middle. Meerkat loves the colour yellow.

Periscope (iOS and Android) was purchased by Twitter shortly after Meerkat debuted. Comments are superimposed in the bottom left-hand corner, and you can show some ‘love’ with hearts that float up the right side of the vertical screen.

You can search Twitter to find live Meerkat streams or live Periscope streams.

Or, New York digital & social agency GLOW offers two ways to sample multiple streams.

Rock band U2 have embraced Meerkat. During the current i+e Tour, according to The Hollywood Reporter,

“The band invites an audience member onto the B stage to shoot a stripped-down number — on this night, ‘Angel of Harlem’ — to be broadcast live via the fledgling Meerkat platform. ‘This goes out across the globe — to about 150 people, until it catches on,’ Bono quipped.”

My take: I think this is truly revolutionary. The ‘airwaves’ for traditional TV broadcasters are strictly controlled by the FCC in America and the CRTC in Canada. Now, everyone with a smartphone has a ‘TV’ camera in their pocket and can begin broadcasting to the world at any time, for free! Journalism and entertainment may never be the same again. Interestingly, both apps use a mobile-friendly vertical orientation, which is decidedly uncinematic.

Disney posits computer-aided editing

Disney researchers are working on an editing algorithm that edits footage from multiple cameras into coherent narratives.

It maps the common attention point in space for all the cameras as a proxy for the common subject. It then applies editing rules such as the 180-degree rule, jump-cut avoidance and cutting on action — things your editor does now.
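
To make those rules concrete, here is a toy sketch (mine, not Disney's algorithm) of how two of them might be checked: cameras must stay on one side of the action line, and successive shots must differ enough in angle to avoid a jump cut.

```python
import math

def viewing_angle(camera, subject):
    """Angle (degrees) from the subject's position to a camera position."""
    return math.degrees(math.atan2(camera[1] - subject[1],
                                   camera[0] - subject[0]))

def valid_cut(cam_a, cam_b, subject, min_delta=30.0):
    """A cut between two cameras is acceptable if both sit on the same
    side of the action line (here simplified to the x-axis through the
    subject -- the 180-degree rule) and their viewing angles differ by
    at least min_delta degrees (jump-cut avoidance)."""
    same_side = (cam_a[1] - subject[1] > 0) == (cam_b[1] - subject[1] > 0)
    delta = abs(viewing_angle(cam_a, subject) - viewing_angle(cam_b, subject))
    delta = min(delta, 360 - delta)  # shortest angular difference
    return same_side and delta >= min_delta

subject = (0, 0)
print(valid_cut((10, 5), (-10, 5), subject))  # True: same side, big angle change
print(valid_cut((10, 5), (11, 5), subject))   # False: jump cut (angles too close)
print(valid_cut((10, 5), (10, -5), subject))  # False: crosses the action line
```

A real system would, of course, score whole sequences of cuts rather than single pairs, but the principle is the same.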

Their video is convincing.

See their take on interactive synchronization as well.

My take: this would be fascinating to see applied to news or documentary footage. It might also be applied to down-and-dirty multi-camera narrative work. The editor of the future’s job might evolve into finessing these cuts, choosing appropriate cutaways and organizing the order of scenes.

Hyperlapse solves shaky time-lapse footage

Fascinating news from Microsoft Research: we can fix your shaky GoPro time-lapse footage.

The technique is called Hyperlapse and they say an application is coming soon, perhaps in a few months.

Better yet, they show you how it’s done.

My take: thanks for sharing, Microsoft. I love that you’ve released the technical know-how as well. I predict a Google Street View-style multiple camera rig capturing overlapping footage to generate a rich ‘picture-scape’ combined with this software to create immersive, real-time, viewer-defined camera movement. Movies, meet video games.

Super 8 is about to make a comeback

The consumer film format called Super 8 was dominant in the sixties, seventies and eighties until the upstart technology called ‘video’ challenged it in the nineties and vanquished it from the marketplace in the new millennium. HD video now rules. With the right lens use and lighting, we can shoot economical, cinematic images.

Nevertheless, are you nostalgic for the real film look? It’s too expensive to actually shoot on film, right? 35mm, even 16mm, is out of reach. But what about Super 8? Is it possible to shoot on Super 8 and transfer to video for post?

My memory of the look of Super 8 is slightly soft, jittery Kodak Kodachrome, with its very warm tone and super-saturated reds. I shot my first films on Super 8, physically splicing the shots together and projecting the original reversal stock, which would jump slightly as the cuts chattered through the projector gate.

One of Super 8’s strengths was also one of its weaknesses. The cartridges were extremely user-friendly but their design meant that the film was held steady during exposure by a simple pressure plate. Jitter, therefore, was built into all Super 8 cameras.

Now, a Danish company called Logmar plans to re-engineer the Super 8 camera. Their idea is to pull the film out of the cartridge and pin register it during exposure. The footage is rock-steady.

What about Super 8 film and developing? North American rights, film and processing will be handled by Pro8mm of Burbank, California.

My take: at 5 grand, this will be an expensive camera. I love the modern technology Logmar is bringing to a mid-century medium, like the digital monitor and SD sound recording. Neat that they can scale this up to 16mm and 35mm as well. And I love the discipline of film versus video. But film! I thought it was dead! That sample footage does look more like 16mm than the Super 8 I remember. Perhaps if they address the dust on the negative and the dirt in the gate by the frame lines….

‘Sharknado 2’ can control your lights

Whether or not you appreciated the concept behind last year’s ‘Sharknado’ — sharks falling out of the sky — you should appreciate the cool technological tie-in that tonight’s outing brings. (Syfy; Space, 9 e, 6 p)

According to Mashable, ‘Sharknado 2: The Second One’ — now set in New York instead of LA — will be able to control your lights. Think flashing during lightning and drenching your room in red during the shark attacks.

That is, if you have Philips hue lights. Combining LED lights, the Internet and smartphone control gives you ‘personal wireless lighting.’

Plus, you need the Syfy Sync app on your smart device.

“The secret sauce of the whole experience is the Syfy Sync app, which typically brings the viewer second-screen information, such as actor profiles and trivia. Similar to Shazam, the app uses audio tagging to identify what the viewer is watching, delivering the right content at the right moment. But the Philips integration takes it to another level.”

My take: kinda cool. With 5.1 sound and responsive lighting, in the proper hands, this could make for very immersive experiences.

A brave new world envisioned by — Disney

The Washington Post reports that Disney is working on a touchscreen that lets you feel textures.

“The technology is called ‘tactile rendering of 3D features,’ and an early version of a rendering algorithm has already been developed by engineers at Disney Research in Pittsburgh. The process behind it is, predictably, both technical and confusing, but the basic premise is that small, electronic pulses can trick your fingers into perceiving bumps and texture, even if the surface is actually flat.”

The right amount of voltage makes you feel ridges, edges, protrusions or bumps!

Check out the video.

My take: This reminds me of the feelies in Aldous Huxley’s Brave New World. Science fact catches up to science fiction!

DIY DCP

If, after reading last week’s post about exhibition formats, you really want a DCP but lack the budget to get one professionally made, you’re in luck.

Danny Lacey has created a tutorial for OpenDCP, open-source software for creating Digital Cinema Packages.

Danny says,

I’ll tell you what, this is going to open a lot of doors for Indie film makers too, I believe it’s going to be incredibly helpful for those going down the self distribution route. It’s quite simply chopping down the prices and expense of delivering your movie.

The 27-minute video has all the details but in a nutshell:

  1. Export your film as a 16-bit TIFF sequence.
  2. Use free, open-source DCP software to convert the TIFF sequence into JPEG 2000.
  3. The DCP software then wraps the video (JPEG 2000) and audio (WAV) into MXF files.
  4. The final stage is creating the DCP, which generates six files that will be recognised by a DCP server.
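
For the command-line inclined, the steps above map onto the OpenDCP tools roughly like this. This is a sketch from memory — the tool names are OpenDCP's, but the exact flags vary by version, so check each tool's built-in help before relying on them:

```shell
# 1. Assumes your NLE has already exported frames/ as a 16-bit TIFF sequence.

# 2. Convert the TIFF sequence to JPEG 2000 at 24 fps.
opendcp_j2k -i frames/ -o j2k/ -r 24

# 3. Wrap the JPEG 2000 video and the WAV audio into MXF files.
opendcp_mxf -i j2k/ -o video.mxf -r 24
opendcp_mxf -i audio.wav -o audio.mxf

# 4. Generate the DCP XML files that a DCP server will recognise.
opendcp_xml --reel video.mxf audio.mxf
```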

My take: well worth the watch!

Google Glass: everyone is a cinematographer now

According to VentureBeat, Google Glass has witnessed its first arrest.

Documentary filmmaker Chris Barrett glassed it in Atlantic City on July 4.

“I picked up my Google Glass explorer edition last week. I wanted to test Glass out, so I filmed some fireworks, getting a very cool first-person perspective. About 10 minutes after the fireworks, we were walking back to our car, and I just decided to try it out on the boardwalk.”

Watch the footage.

My take: welcome to the voyeur world, where everyone is a cinematographer. Right now, the public is unaware and continues to “act natural”. But will behaviour change? Has it changed with CCTV and cellphone video? Very soon, documentaries will look very different when everyone has their own Glass. Editing may be replaced with web-based crowd-sourced Glass-fueled media streams, like Switchcam.

Hyperaudio Pad: cut and paste transcript video editing

One of the problems with digital film is that it’s too damn easy to just shoot and shoot and shoot.

Great! But then, when it’s time to edit, you’re faced with hours and hours and hours of footage.

Just logging it can take days.

A new technology promises to revolutionize this and more: Hyperaudio Pad.

Twenty-two final-round winners in the Mozilla Ignite Challenge were just announced. One of them is Hyperaudio Pad by Mark Boas et al.

The creators think they’ve built a language learning tool. I think it’s a whole new way to edit video.

What’s so cool about it?

  • It automatically transcribes your footage.
  • You can cut and paste the words to create visual sequences.

See the web page and the pitch video.

Most importantly, see the application they made for Al Jazeera English. Search for ‘nuclear’ and then click on each of the 35 tiny red and blue squares.

My take: imagine letting Hyperaudio Pad loose on your footage. How great would your next documentary be?

Note: Avid has something similar called ScriptSync.

Get ready for 8K!

On the heels of the massive 4K consumer television released by Sony, the Japanese have done it again.

NHK, Japan’s public broadcaster, will screen the first 8K film in Cannes later this month.

“8K Super Hi-Vision was developed by NHK and delivers a resolution of 7,680 by 4,320 pixels – four times the resolution of 4K and 16 times that of HD – as well as featuring 22.2 multichannel sound.”
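
The arithmetic in that quote checks out, assuming ‘4K’ means consumer UHD (3840 by 2160) rather than cinema 4K (4096 by 2160):

```python
uhd_8k  = 7680 * 4320  # 8K Super Hi-Vision: 33,177,600 pixels
uhd_4k  = 3840 * 2160  # consumer 4K (UHD):   8,294,400 pixels
full_hd = 1920 * 1080  # full HD:             2,073,600 pixels

assert uhd_8k == 4 * uhd_4k    # four times the resolution of 4K
assert uhd_8k == 16 * full_hd  # sixteen times that of HD
```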

Film theatres are just finishing their transition to digital projectors capable of 4K.

My take: I think technology is about to surpass biology; going to see a film in a movie theatre might one day be a sharper and clearer experience than anything you could see in real life. We may never go outside again!