Netflix scores with trippy interactive movie ‘Bandersnatch’

The Black Mirror team have handed Netflix a major win with interactive movie ‘Bandersnatch.’

Netflix has dabbled with interactive titles before, but only for kids. This outing is decidedly grown-up, with drugs, madness and violence.

As summarized on SYFYwire:

“To experience the film, viewers begin watching it like any other program or movie. But as Bandersnatch moves along, a series of choices appear on the screen, roughly every few minutes or so. Using a remote control, console controller, or keyboard, viewers make the decisions for the story’s protagonist, sending the narrative off in any number of new directions.”

Over five hours of completed scenes were shot.

“The branching narratives have been developed through a new Netflix software called Branch Manager, which can also allow viewers to exit and start all over again if they choose. The project is supported on most TVs, game consoles and either Android or iOS devices as long as they’re running the latest version of Netflix.”
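Conceptually, a title like this is just a directed graph of scenes with timed choice points and default branches. Here is a toy sketch in Python of how such a structure might be represented and walked; it is purely illustrative and has nothing to do with Netflix's actual Branch Manager software.

# Toy model of a branching narrative: a directed graph of scenes, where
# some scenes end in a timed choice with a default branch. Illustrative
# only; not Netflix's Branch Manager.
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class Scene:
    title: str
    choices: Dict[str, str] = field(default_factory=dict)  # choice label -> next scene id
    default: Optional[str] = None                           # branch taken if the viewer doesn't pick

scenes = {
    "breakfast": Scene("Choose your cereal",
                       {"Sugar Puffs": "studio", "Frosties": "studio"},
                       default="studio"),
    "studio": Scene("Accept the job offer?",
                    {"Yes": "ending_a", "No": "ending_b"},
                    default="ending_b"),
    "ending_a": Scene("Ending A"),
    "ending_b": Scene("Ending B"),
}

def play(start):
    node = start
    while node is not None:
        scene = scenes[node]
        print("Now playing:", scene.title)
        if not scene.choices:
            break
        answer = input("Pick one of %s (blank = default): " % list(scene.choices))
        node = scene.choices.get(answer, scene.default)

play("breakfast")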

How complex is the story map? See this picture or this summary. Warning: contains spoilers.

My take: I love this! The theme of who’s in control is perfect for an interactive movie about a programmer programming an interactive video game. The fact that the movie has been gamified (as the various endings have different point ratings) is hard to miss. Makes my first interactive musings look simple in comparison. I would love to make a Netflix Interactive Movie!

LG announces new Ultra Short Throw 4K projector

Will you still be on Santa’s Nice List in January?

LG will announce its latest CineBeam Laser 4K projector at CES 2019 in Las Vegas within two weeks.

The HU85L uses Ultra Short Throw (UST) technology that lets it project a 90-inch diagonal image when the unit is a mere 2 inches from the wall.

It can project a 120-inch image when placed 7 inches from the wall.
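Some quick back-of-the-envelope arithmetic (mine, assuming a standard 16:9 picture) to put those diagonals in perspective:

# Convert the quoted diagonals into width and height for a 16:9 image.
import math

def dimensions(diagonal_inches, aspect_w=16, aspect_h=9):
    scale = diagonal_inches / math.hypot(aspect_w, aspect_h)
    return aspect_w * scale, aspect_h * scale

for diag, throw in [(90, 2), (120, 7)]:
    w, h = dimensions(diag)
    print(f'{diag}" image: about {w:.0f}" wide by {h:.0f}" tall, from just {throw}" off the wall')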

Here’s a review of last year’s model.

My take: Not cheap at something like $3,000 (3K for 4K?), but it would mean you could dispense with a TV in your living room.

AI-generated photos now lifelike

Tero Karras, Samuli Laine and Timo Aila of Nvidia have just published breakthrough work on Generative Adversarial Networks and images:

“We propose an alternative generator architecture for generative adversarial networks, borrowing from style transfer literature. The new architecture leads to an automatically learned, unsupervised separation of high-level attributes (e.g., pose and identity when trained on human faces) and stochastic variation in the generated images (e.g., freckles, hair), and it enables intuitive, scale-specific control of the synthesis. The new generator improves the state-of-the-art in terms of traditional distribution quality metrics, leads to demonstrably better interpolation properties, and also better disentangles the latent factors of variation.”

I’ve blogged about Nvidia’s GANs and image generation before, but this improvement in quality is remarkable.

If I understand it correctly, the breakthrough is applying one picture as a “style” or filter on another picture. Applying the filters in the left column to the pictures across the top yields the AI-generated pictures in the middle.
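Here is a very loose conceptual sketch in Python of that "style mixing" idea: a mapping network turns a latent code into a style vector, and the generator can take coarse-layer styles from one source and fine-layer styles from another. The mapping and synthesis functions below are stand-ins I made up, not the authors' architecture.

# Loose conceptual sketch of style mixing, NOT the paper's actual code.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((512, 512)) * 0.01   # stand-in for the learned mapping weights

def mapping(z):
    # Stand-in for the mapping network z -> w (the "style" vector).
    return np.tanh(z @ W)

def synthesis(styles_per_layer):
    # Stand-in for the synthesis network: one style vector per layer.
    # A real style-based generator modulates each layer's convolutions with its style.
    return sum(styles_per_layer)  # placeholder "image"

n_layers = 14
z_a, z_b = rng.standard_normal(512), rng.standard_normal(512)
w_a, w_b = mapping(z_a), mapping(z_b)

# Coarse layers (0-3) take B's styles, the rest take A's:
# B contributes pose and face shape, A contributes the finer details.
mixed = [w_b if layer < 4 else w_a for layer in range(n_layers)]
image = synthesis(mixed)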

Read the scientific paper for full details.

Of course, we’ve seen something similar before. Way back in 1985, Godley & Creme released a music video for their song Cry; the evocative black-and-white video used analogue wipes and fades to blend a myriad of faces together, predating digital morphing. Here’s a cover version and video remake by Gayngs, including a cameo by Kevin Godley.

My take: Definitely scary. But if that’s the current state of the art, I think it means we are _not_ living in the Simulation — yet, even though Elon Musk says otherwise.

Pirates be warned: Blockchain is on patrol

Arrgh! Today be International Talk Like a Pirate Day so ’tis mighty fittin’ that news out o’ th’ Toronto International Film Festival announces a new tack on film distribution ‘n online piracy.

Canadian post-production companies Red Square Motion ‘n Unstable Ground ‘ave joined wit’ distributor Indiecan Entertainment t’ launch LightVAULT.

Th’ new crew offers an end t’ end solution fer th’ secure holding, quality control, conversion ‘n delivery o’ film assets t’ clients around th’ globe, promisin’ a one-stop solution fer digital storage, protection ‘n delivery o’ film ‘n media treasure.

“The core of the service is designed with protection of content from unauthorized sharing and piracy in mind, by using a blockchain-based forensic encoding technology.”

They harness technology from South Africa’s Custos who have “built a platform designed to incentivise people in criminal communities from all over the world to protect your content.”

“You never want your film to be on The Pirate Bay. You probably don’t want your film being shared on campus networks. You don’t want the guy from the local paper that you asked to review your movie sending it to his friends. We get people from all over the world to anonymously tell us when they find your film where it shouldn’t be. We have blockchain magic, and we will find them.”

How do they do that?

Bounty!
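I don't know the specifics of Custos's system, but the broad idea seems to be that every screener copy carries its own forensic watermark tied to a bounty, so a claimed bounty points straight back at the copy that leaked. A toy sketch of that bookkeeping (my own, not Custos's code):

# Toy sketch of bounty-based leak tracing. NOT Custos's actual system;
# it only illustrates the bookkeeping: each screener copy gets a unique
# watermark ID tied to a bounty, and a claimed bounty identifies the
# recipient whose copy escaped into the wild.
import secrets

recipients = ["festival_programmer", "local_reviewer", "sales_agent"]

# Issue one uniquely watermarked copy (and bounty) per recipient.
watermarks = {secrets.token_hex(8): name for name in recipients}

def bounty_claimed(watermark_id):
    """Called when a bounty embedded in a pirated copy is claimed."""
    leaker = watermarks.get(watermark_id, "unknown copy")
    return f"Leak traced to: {leaker}"

# Example: a pirate-site copy turns up and its bounty gets claimed.
leaked_id = next(iter(watermarks))
print(bounty_claimed(leaked_id))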

Pasha Patriki of Red Square Motion explains why he’s promoting this technology:

“My own feature film that I Executive Produced and directed (Black Water, starring Jean-Claude Van Damme and Dolph Lundgren), was leaked online several months before its official world release date. Since then, I have been researching technologies that could help track every delivery and download of the master files of the film.”

My take: ’tis a smart solution that helps filmmakers keep tabs on thar screeners. ‘Tis prolly worth th’ cost, as th’ price o’ piracy be growin’ in terms o’ lost revenue ‘n compliance. Arrgh!

How to spot deepfakes

Siwei Lyu, writing in The Conversation, brings attention to “deepfakes” and offers a simple way to spot them.

What’s a deepfake?

From Wikipedia:

“Deepfake, a portmanteau of “deep learning” and “fake”, is an artificial intelligence-based human image synthesis technique. It is used to combine and superimpose existing images and videos onto source images or videos.”

My favourite technology show, BBC Click, explains it well.

Back to Siwei:

“Because these techniques are so new, people are having trouble telling the difference between real videos and the deepfake videos. My work, with my colleague Ming-Ching Chang and our Ph.D. student Yuezun Li, has found a way to reliably tell real videos from deepfake videos. It’s not a permanent solution, because technology will improve. But it’s a start, and offers hope that computers will be able to help people tell truth from fiction.”

The key?

Blinking.

“Healthy adult humans blink somewhere between every 2 and 10 seconds, and a single blink takes between one-tenth and four-tenths of a second. That’s what would be normal to see in a video of a person talking. But it’s not what happens in many deepfake videos.”

They analyze the rate of blinking to decide the veracity of the video.

“Our work is taking advantage of a flaw in the sort of data available to train deepfake algorithms. To avoid falling prey to a similar flaw, we have trained our system on a large library of images of both open and closed eyes. This method seems to work well, and as a result, we’ve achieved an over 95 percent detection rate.”
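As a rough illustration of the idea (not the researchers' actual detector), you could score a video by comparing its blink statistics against the human norms quoted above:

# Rough illustration of blink-rate screening; not the researchers' code.
# Assume some upstream model supplies an "eye openness" score per frame;
# we count blinks and compare against the human range of roughly one
# blink every 2 to 10 seconds.

def count_blinks(openness, threshold=0.2):
    """Count transitions from open to closed eyes."""
    blinks, closed = 0, False
    for score in openness:
        if score < threshold and not closed:
            blinks += 1
            closed = True
        elif score >= threshold:
            closed = False
    return blinks

def looks_fake(openness, fps=30.0, min_blinks_per_minute=6):
    # 6 blinks per minute = one every 10 seconds, the slow end of normal.
    seconds = len(openness) / fps
    rate = count_blinks(openness) / (seconds / 60.0)
    return rate < min_blinks_per_minute  # suspiciously few blinks

# Example: 60 seconds of footage in which the eyes never close.
print(looks_fake([1.0] * 1800))   # True -> flag for closer inspection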

My take: Wow! So, basically, now you can no longer believe what you read, hear or see. Interestingly, this means that IRL will take on added value. (Oops, it seems that technology has already moved on: now deepfakes can include blinking.)

U2 explores AR

Irish band U2 has always embraced technology and continues to do so on its latest tour, which incorporates AR.

AR (Augmented Reality) superimposes information on top of your phone’s camera image.

Fans attending the shows will be able to hold up their phones to reveal a huge iceberg and a virtual singing Bono.

You can download the U2 eXPERIENCE app here. To test drive it, point it at the album cover for Songs of Experience. A virtual cover will float on top of the picture of the cover, shatter into shards as music begins to play, and then an animated Bono will begin to sing.

As you move your phone side to side or up and down, you’ll see different angles of the holographic representation.

My take: this is pretty cool and might be many folks’ first experience of AR.

News on the brain front

Having written a feature about a body-less man, I’m always interested in learning about scientific advances concerning brains.

Previously, I’ve written about the Italian maverick planning to transplant a human head.

This week, news out of Yale claims that American scientists have revived pig brain activity for 36 hours.

Nenad Sestan is quoted as saying:

“That animal brain is not aware of anything, I am very confident of that. Hypothetically, somebody takes this technology, makes it better, and restores someone’s activity. That is restoring a human being. If that person has memory, I would be freaking out completely.”

Read more details here.

My take: this subject gets murky very quickly. Witness the ethical issues the scientists raise in Nature. Another piece questions whether a brain kept alive without stimuli would amount to torture. Heady stuff.

Features shot on iPhones

First there was Tangerine.

Now there’s Unsane.

Director Steven Soderbergh has revealed that he shot almost all of his latest film on an iPhone 7 Plus.

Jay Pharoah says Soderbergh should have used a Samsung.

I believe Sean Baker actually used an iPhone 5S to shoot Tangerine. Here he spills the full beans.

He mentions the Moondog Labs anamorphic adapter lens and the Filmic Pro app.

My take: basically, lack of a suitable camera is no longer an excuse for not filming. But everything else stays the same, starting with a great script and a smart plan.

Blackmagic 4K Pocket Cinema Camera Announced

As reported by No Film School in their NAB 2018 coverage, Blackmagic Design has just announced the next iteration of their Pocket Cinema Camera: 4K for under $1300 USD.

Shipping in September, according to Jacob Kastrenakes of The Verge:

“The new Pocket Cinema Camera 4K has a ton of features that’ll appeal to that market — like a mini XLR connector, LUT support, and 4K recording at 60 fps — but it still has limitations that’ll keep the camera confined to a niche audience (which, to be fair, is kind of true of every camera). Basically, unless you’re a filmmaker who’s typically in control of lighting and the overall environment they’ll be filming in, this camera probably isn’t for you. It doesn’t have in-body stabilization, and the small sensor will struggle in low light and require adaptors to get the depth of field you’d get from full frame or even Super 35 cameras. That might not matter to some filmmakers, but it could be an issue for people on fast shoots or traveling to unfamiliar locations.”

Here are the specs:

  • Full-size Four Thirds sensor
  • Native 4096 x 2160 resolution
  • 13 stops of dynamic range
  • Up to 25,600 ISO
  • Carbon fiber polycarbonate composite body
  • Built-in SD/UHS-II and CFast card recorders
  • USB-C expansion port for external SSD or flash disk
  • Full-size HDMI output
  • Professional mini XLR input
  • 3.5mm audio jack
  • Built-in 5” LCD touchscreen
  • 3D LUTs
  • 4th generation Blackmagic color science
  • Remote camera control via Bluetooth

My take: Wow! There are so many great high-resolution video cameras available now or arriving soon.

Really Big Rollable 4K OLED Screen Announced

LG Display showed off the world’s first and largest rollable 4K OLED screen at CES this year.

As reported by David Pierce in Wired:

“The 65-inch display sits flat and sturdy on your wall, like a normal television, until you’re done with it. With one push of a button, the display descends down into its stand, rolling around a coil like wrapping paper. The screen can roll up completely for safe storage and easy transportation, or you can leave a small section of it sticking up, at which point the screen automatically shifts into a widgetized, information-providing display with weather and sports scores. LG’s device has almost nothing in common with most TVs, other than its size. Functionally, it’s more like a really big tablet.”

Fully unrolled, the aspect ratio is 16:9.

But wait, there’s more! It can roll down to 21:9, eliminating the black bars above and below widescreen movies.
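Some quick arithmetic on what that means for a 65-inch panel (my numbers, not LG's):

# A 65" 16:9 panel, and how far it can retract while still showing a
# full-width 21:9 picture. Back-of-the-envelope figures only.
import math

DIAGONAL = 65.0
width = DIAGONAL * 16 / math.hypot(16, 9)      # about 56.7"
height_169 = width * 9 / 16                    # about 31.9"
height_219 = width * 9 / 21                    # about 24.3"

print(f'Full 16:9: {width:.1f}" x {height_169:.1f}"')
print(f'Rolled to 21:9: {height_219:.1f}" tall '
      f'({height_169 - height_219:.1f}" of screen tucked away)')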

My take: I want one! I would hang it upside down from the ceiling, so it would mimic a cinema screen of yore.