Week 7 (Feb. 27) ALGORITHMIC VISION

IN CLASS

Viewing of ALGORITHMIC VISION exercises.

Discussion of assigned articles around machine vision.

EXERCISE: DECORRELATED VISION

Video & Sound, 2 minutes

This exercise will require at least 2 cameras (phones), with no limit to the number of cameras used.

1. An action will be filmed by multiple cameras running simultaneously.

2. These images will be overlaid (one on top of the other) and synchronized. Use opacity controls within your software to overlay these tracks.

3. These cameras should not be static, but moving.

4. You will need some form of synching mechanism, such as a conventional “slate” or loud sound, captured by both cameras so you can ensure synch between the tracks (a short sketch of how this offset can be found automatically follows the examples below).

EX: You are performing an action with a camera strapped to your body AND you are being filmed performing this action by one (or several) other camera(s).

EX: You are filming an action you are not performing with multiple cameras.
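A minimal sketch of the “slate” idea in step 4, in case you want to automate it rather than line things up by eye and ear (this is an illustration, not part of the assignment; the function and file names are placeholders): if both cameras record the same clap or loud sound, cross-correlating the two audio tracks gives the offset by which one clip should be trimmed so the tracks stay in synch.

```python
# Illustrative only: estimate the offset between two recordings from their audio.
# Works best when the slate/clap is clearly the loudest event near the start.
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate

def offset_seconds(wav_a: str, wav_b: str) -> float:
    """How many seconds later the shared sound occurs in track B than in
    track A: trim that much from the start of B (if negative, trim from A)."""
    rate_a, a = wavfile.read(wav_a)
    rate_b, b = wavfile.read(wav_b)
    assert rate_a == rate_b, "export both audio tracks at the same sample rate"
    # collapse stereo to mono so the clap shows up as a single strong peak
    a = a.astype(np.float32).mean(axis=-1) if a.ndim > 1 else a.astype(np.float32)
    b = b.astype(np.float32).mean(axis=-1) if b.ndim > 1 else b.astype(np.float32)
    lag = (len(b) - 1) - np.argmax(correlate(a, b, mode="full"))
    return lag / rate_a
```

In practice your editing software’s opacity overlay plus a visible/audible slate is enough; the sketch only shows that the same logic can be automated.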

RESEARCH ON MACHINE VISION

Research the machine vision-related link you have been assigned: come prepared to talk about its context, implications, modes of operation etc.

______________________________

MACHINE VISION & PANOPTICS

1. MACHINE VISION

Machinic contact lenses!

Alien / Nonhuman Perspectives

How does a machine see?
How do machines complicate human vision? (We have always been cyborgs!)
“The point here is that if we want to understand the invisible world of machine-machine visual culture, we need to unlearn how to see like humans.” (Paglen)
“We no longer look at images–images look at us. They no longer simply represent things, but actively intervene in everyday life.” (Trevor Paglen, Invisible Images: Your Pictures Are Looking At You)

Dziga VERTOV, Man with a Movie Camera (1929)

film can GO ANYWHERE

PANOPTIC – anticipates ubiquitous surveillance

KINOEYE – KINOK (already a cyborgian idea)

separation from theatre and literature / no dialogue / no actors / no “storyline”

fast pacing (for the time)! 1800 shots! (Critique: “The producer, Dziga Vertov, does not take into consideration the fact that the human eye fixes for a certain space of time that which holds the attention.”)

Michael SNOW—La Région Centrale (1971) and the original arm (DE LA)

Eva KOCH - Evergreen (2006)

Eric CAZDYN, The Blindspot Variations / Reconfigured Participation (2015)

the fields of view of each camera don’t neatly stitch together / gaps appear in which things still happen

WEBCAM

Dariusz KOWALSKI, Optical Vacuum (2008) - excerpt here

Joana MOLL, AZ: Move and Get Shot (2016)

ANIMAL CAMS

Seagull Stole My GoPro

Sheepview360

Mr. Lee (CatCam)

MACHINE TIME

The Pirate Cinema

The hidden activity and geography of real-time peer-to-peer file sharing via BitTorrent is revealed in The Pirate Cinema, an online piece by Nicolas Maigret. In this monitoring room, omnipresent telecommunications surveillance gains a global face, as the program plunders the core of restless activity online, revealing how visual media is consumed and disseminated across the globe. This live work produces an arbitrary mash-up of the BitTorrent files being exchanged in real time, based on the traffic of the Pirate Bay’s top 100 videos. These fragmentary contents in transit are monitored, transforming BitTorrent network users (unknown to them) into contributors to an endless audio-visual composition.

(Detailed presentation)

BULLET TIME

TEMPORAL TO SPATIAL (following the underlying logic of the digital, of the database)

MATRIX SCENE

stills converted into movie frames: the filmmakers are able, as Alexander Galloway puts it, “to freeze and rotate a scene within the stream of time,” and to view the scene, at each moment, from any desired angle

BUT, still inscribed into a linear, temporal narrative

SPLITTING THE ATOM (Massive Attack, dir. Edouard Salier)

(see Shaviro)

like a computer game in some way - though the music temporalizes it

doesn’t employ montage (moving the camera and fixing the world – the entire space is given in advance)

Instead of time as “inner sense,” we now have an exterior time, one entirely separate from the time-that-fails-to-pass within the video’s rendered space.

VR (enhanced experience: cyborg)

Harun Farocki, Serious Games III: Immersion (2009) - excerpts here and here and here

The Philosophy of VR (David Chalmers)

How Filmmakers Push Your Eyes Around the Screen At Will

modeling of human behavior in tandem with modeling AI behavior (both evolving simultaneously)

eyes transition from exploratory mode (overall) to information-extraction mode (details filled in)

remember SHOW SHHOORRTY a few weeks ago? (exploratory to info-extraction)

“focal points”

“motion onsets” capture attention

ongoing activity parsed into discrete events (beginning of new shot initiates new exploratory phase)

COST CUTTING!!!: perceptual technics (like MP3 – saving bandwidth / money on regions that perception discards)
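A toy sketch of “perceptual technics” in the cost-cutting sense above (my own illustration, not from the readings): measure how much each block of the frame is changing, then let an encoder spend quality on the moving “focal points” that capture attention and discard detail in the static regions the eye is likely to ignore.

```python
# Illustrative only: per-block "attention" weights from inter-frame motion.
import numpy as np

def block_quality(prev_frame: np.ndarray, frame: np.ndarray, block: int = 16) -> np.ndarray:
    """Per-block weight in [0, 1]: high where the frame is changing (motion
    onsets that attract the eye), low where it is static."""
    diff = np.abs(frame.astype(np.float32) - prev_frame.astype(np.float32))
    h = diff.shape[0] - diff.shape[0] % block          # crop to whole blocks
    w = diff.shape[1] - diff.shape[1] % block
    blocks = diff[:h, :w].reshape(h // block, block, w // block, block, -1)
    energy = blocks.mean(axis=(1, 3, 4))               # mean change per block
    return energy / (energy.max() + 1e-8)              # 1 = most motion

# A perceptual encoder could map these weights to quantization settings, so
# static background blocks are compressed harder than the moving regions,
# saving bandwidth on what perception discards (the MP3 logic, applied to video).
```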

SURVEILLANCE—EXPANDED PANOPTICS

Surveillance Camera Players

Harun Farocki, I Thought I Was Seeing Convicts (2000)

Informatic Opacity (Zach Blas)

PRISM: making visible / informatic visibilities

PRISM transparently mediates light – TRANSPARENCY

construction of models to evaluate against = HUMAN ALL TOO HUMAN

human fully knowable (quantified self) – identity reduced to aggregates of quantifiable data

minoritarian persons not measurable (dark skin undetectable)

OPACITY as resistance – TOR network – as mutated QUEERNESS – subverting identification standardization

These are withdrawals from power through collective stylings but also occupations of zones that lie outside the perceptual registers of control. Informatic opacity, then, is not about simply being unseen, disappearing, or invisible, but rather about creating autonomous visibilities, which are trainings in difference and transformation.

Manifesto for CCTV Filmmakers

DARPA Sponsors Surveillance Technology to Predict Future Behavior (2012)

“an artificial intelligence system that can watch and predict what a person will ‘likely’ do in the future using specially programmed software designed to analyze various real-time video surveillance feeds. The system can automatically identify and notify officials if it recognizes that an action is not permitted, detecting what is described as anomalous behaviors.”

“VIDEO SUMMARIZING TECHNOLOGIES”

DARPA is building a drone to provide ‘persistent’ surveillance virtually anywhere in the world

CORRELATION of surveillance: photos on Facebook that never go away – and the better AIs get at trawling and collating data, the more of your past becomes subject to examination, the more your own history is rewritten

LAWSUIT asking for Google to remove news stories: Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González (2014), a decision by the Court of Justice of the European Union (CJEU).

your past never stops haunting you

1. individualization / differentiation based on a wealth of factors

2. reification of those categories, removing ambiguity so these metadata profiles can be operationalized

amplifies diversity of METADATA signatures in order to capitalize on smaller and smaller domains of everyday life

GENERATIVE VIDEO: MACHINE LEARNING

generalizations + prediction

this happens all the time with the images your phone takes (comparison with past photos — the past does determine the future!!!) (See Hito Steyerl, Proxy Politics)

the world is reproduced through computational models – including bias, prejudice

continues to reflect bias (Try Googling “three white teenagers” and “three black teenagers”)

(underrepresentation in datasets of certain populations)
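A toy numerical illustration of the underrepresentation point (my own sketch, with made-up data, not from any of the linked pieces): a single classifier trained on a dataset dominated by one group ends up fitting that group’s patterns, and its accuracy on the underrepresented group drops even though nothing about the task itself is harder for that group.

```python
# Illustrative only: bias from an imbalanced training set.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def sample(group: int, n: int):
    """Made-up data: each group's examples cluster around a different centre,
    and the labelling rule is the same relative to that centre."""
    centre = np.array([0.0, 0.0]) if group == 0 else np.array([3.0, 3.0])
    X = rng.normal(centre, 1.0, size=(n, 2))
    y = (X.sum(axis=1) > centre.sum()).astype(int)
    return X, y

X0, y0 = sample(0, 2000)        # well represented in the training set
X1, y1 = sample(1, 50)          # underrepresented
model = LogisticRegression().fit(np.vstack([X0, X1]), np.hstack([y0, y1]))

X0_test, y0_test = sample(0, 1000)
X1_test, y1_test = sample(1, 1000)
print("accuracy, group 0:", model.score(X0_test, y0_test))   # high
print("accuracy, group 1:", model.score(X1_test, y1_test))   # noticeably lower
```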

A beauty contest was judged by AI and the robots didn’t like dark skin

Tay, the racist AI bot

GOOGLE: Algorithms trained on white people interpret black people as GORILLAS

FACEBOOK ELIMINATES HUMAN EDITORS (since rescinded)

hard for the average individual to intervene in

everyday YouTube navigation is training YouTube algorithms

Fei-Fei Li, How We’re Training Computers to Understand Pictures (see errors at 14:30)

INTERESTING.JPG (a project by the University of Toronto’s Deep Learning Group)

Blade Runner Neural Net Reconstruction AND more technical details

Autoencoding Video Frames

machine encoded video mistaken for the real thing by another algorithm

two stages: compression (encoding) - decompression (reconstruction): teaching an artificial neural net to do this without human parameter decision

ADVERSARIAL NETWORK (noise / signal adjustments through feedback cycles)

seeing the film THROUGH the neural network

using the BLADE RUNNER model to reconstruct other films!
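A minimal sketch of the two-stage encode/decode idea described above, assuming 64x64 RGB frames as input (the project’s own model is more elaborate, a variational autoencoder with a learned similarity metric trained adversarially; this stripped-down version only shows the structure of compressing each frame to a small code and reconstructing it, with both stages learned from reconstruction error alone):

```python
# Illustrative only: a frame autoencoder, not the Blade Runner project's model.
import torch
import torch.nn as nn

class FrameAutoencoder(nn.Module):
    def __init__(self, latent_dim: int = 200):
        super().__init__()
        self.encoder = nn.Sequential(                            # 3x64x64 -> code
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),  # -> 32x32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(), # -> 64x16x16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )
        self.decoder = nn.Sequential(                            # code -> 3x64x64
            nn.Linear(latent_dim, 64 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (64, 16, 16)),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, frames):
        return self.decoder(self.encoder(frames))

# Training step: minimize pixel reconstruction error over the film's frames.
model = FrameAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
frames = torch.rand(8, 3, 64, 64)        # stand-in batch of film frames
loss = nn.functional.mse_loss(model(frames), frames)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```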

Machine Predictive Video: Deep Learning Program Hallucinates Videos and Generating Videos with Scene Dynamics

takes advantage of huge, unprecedented databases of online videos

GENERATOR network <—> DISCRIMINATOR network (stand-in for a human viewer)
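A compact sketch of that generator/discriminator feedback loop (again my own, reduced to tiny fully connected networks over flattened clips just to make the structure visible; the “Scene Dynamics” model itself uses spatio-temporal convolutions over video volumes):

```python
# Illustrative only: one GAN training step, generator vs. discriminator.
import torch
import torch.nn as nn

clip_size, noise_size = 1024, 100                   # stand-in dimensions
G = nn.Sequential(nn.Linear(noise_size, 256), nn.ReLU(),
                  nn.Linear(256, clip_size), nn.Tanh())
D = nn.Sequential(nn.Linear(clip_size, 256), nn.LeakyReLU(0.2),
                  nn.Linear(256, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real_clips = torch.rand(16, clip_size) * 2 - 1      # stand-in for database clips

# Discriminator step: learn to tell real clips from generated ones.
fake_clips = G(torch.randn(16, noise_size)).detach()
d_loss = bce(D(real_clips), torch.ones(16, 1)) + bce(D(fake_clips), torch.zeros(16, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: produce clips the discriminator (the machine "viewer") accepts.
fake_clips = G(torch.randn(16, noise_size))
g_loss = bce(D(fake_clips), torch.ones(16, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```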

Hito Steyerl, A Sea of Data: Apophenia and Pattern (Mis-)Recognition

Hito Steyerl, Proxy Politics: Signal and Noise

“computational photography”—interfaced with all kinds of systems / disabling / altering ECOLOGY

speculative and relational

makes seeing unforeseen things more difficult (novelty?)

who decides on signal vs. noise? (noise = what one doesn’t want to hear —> leads to vertical class hierarchies)

rendering cognition, behavior quantifiable and commensurable to a system of exchange value based in data

TWITTER BOTS—your picture becomes autonomous (detached from its body, context)

ENHANCEMENT, CSI STYLE: GOOGLE Super Resolution Zoom Enhance

Fake Images are Getting Harder and Harder to Detect

manipulated images distort the historical record

FORENSIC TESTS: These tests range from analyzing the position and shape of people’s irises in photographs to whether or not the sources of light in a photograph are consistent for the entire image.

but these detection tests can also be used constructively!!!
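The iris and lighting checks described in the article need geometric modeling; a simpler, commonly used forensic cue that can be sketched in a few lines is error level analysis, which re-saves a JPEG and amplifies where the compression residue is inconsistent (this is my own illustration of a “constructive” use, not a technique named in the article; the function name and quality setting are placeholders).

```python
# Illustrative only: error level analysis (ELA) for spotting possible edits.
import io
from PIL import Image, ImageChops

def error_level_analysis(path: str, quality: int = 90) -> Image.Image:
    """Re-save the image as JPEG and amplify the difference; regions pasted in
    from elsewhere often show a different error level than the rest."""
    original = Image.open(path).convert("RGB")
    buffer = io.BytesIO()
    original.save(buffer, "JPEG", quality=quality)
    buffer.seek(0)
    resaved = Image.open(buffer)
    diff = ImageChops.difference(original, resaved)
    max_diff = max(channel_max for _, channel_max in diff.getextrema()) or 1
    scale = 255.0 / max_diff
    return diff.point(lambda value: min(255, int(value * scale)))
```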
