Hunting a Snark? | Rudi Knoops


exploring uncharted terrain on the New Media map (after Lewis Carroll)

Flocking behaviour & influence maps

I’ve been exploring different scenarios for using randomness in a controlled way in the large interactive wall for the museum installation. Randomness and control? Indeed: there is always some kind (and degree) of control when you integrate randomness.
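To make that concrete, here is a minimal Processing sketch (just an illustration, not code from the installation): by seeding the random generator and constraining its range, the output is random yet reproducible and bounded.

```processing
// Controlled randomness: a fixed seed plus a constrained range.
// The same seed produces the same "random" layout on every run.
void setup() {
  size(400, 200);
  randomSeed(42);  // fixes the sequence, making runs reproducible
  noStroke();
  for (int i = 0; i < 50; i++) {
    // random positions, but confined to a band we choose
    float x = random(width);
    float y = random(height * 0.25, height * 0.75);
    ellipse(x, y, 8, 8);
  }
}
```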
I’ve been experimenting with the ‘Assembler’ code from yaief.wordpress.com/2008/12/09/fun-with-flash-assembler/. One possible idea was to use treelike structures as a background for each of the 12 elements of the installation:


Video: influencing Flocking behaviour using image maps (from Rudi Knoops on Vimeo).

This video shows the output of the Flash experiment, where an influence map is used to steer the flocking behaviour of particles: the particles line up in the shape of the influence map, here a treelike structure.
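The experiment itself is Flash/ActionScript, but the principle is easy to re-sketch in Processing (the environment I use further on in this post): particles do a random walk and almost stop wherever the influence map is dark, so over time they accumulate along the map’s treelike shape. The file name tree.png is a placeholder for whatever influence map you load.

```processing
// Particles wander at random, but almost stop wherever the influence
// map is dark, so over time they line up along its treelike shape.
// "tree.png" is a placeholder for any black-on-white influence map.
PImage influence;
int n = 2000;
float[] px = new float[n];
float[] py = new float[n];

void setup() {
  size(640, 480);
  influence = loadImage("tree.png");  // placeholder file name
  influence.resize(width, height);
  influence.loadPixels();
  for (int i = 0; i < n; i++) {
    px[i] = random(width);
    py[i] = random(height);
  }
  stroke(255);
}

void draw() {
  background(0);
  for (int i = 0; i < n; i++) {
    int ix = constrain((int) px[i], 0, width - 1);
    int iy = constrain((int) py[i], 0, height - 1);
    float b = brightness(influence.pixels[iy * width + ix]);
    // dark pixel = inside the shape: tiny steps; bright = keep wandering
    float step = (b < 64) ? 0.3 : 4.0;
    px[i] = constrain(px[i] + random(-step, step), 0, width - 1);
    py[i] = constrain(py[i] + random(-step, step), 0, height - 1);
    point(px[i], py[i]);
  }
}
```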

When using two such ‘elements’ within one Flash project, it immediately became clear that this was too CPU-intensive. Pity.

Conclusion: interesting technology, but not usable for the project I’m working on now.

motion tracking in processing

I’m currently building a playful Processing application for the pre-opening of the new Media & Design Academy building at C-Mine. The pre-opening is scheduled for the weekend of April 25-26, 2009, so I have quite some time left to fine-tune the draft version I have now.

using my old phone as trigger

using my phone as trigger; the red square outlines the detected blob

Concept: the mirrored video image of people passing through a specific frame is projected life-size. When you stand still in the frame, you simply see the projection of your mirror image; movement reveals an extra layer. Using motion detection, ribbon-like particles that follow the movement in the frame are superimposed on the video source.

Curious how people will interact with this.

For the time being it’s based on the motion tracking provided by the JMyron Processing library.
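For reference, the bare skeleton of such a sketch, assuming JMyron’s documented calls (start, update, image, findGlobs, globCenters); the red square it draws corresponds to the one in the picture above.

```processing
import JMyron.*;

JMyron cam;

void setup() {
  size(640, 480);
  cam = new JMyron();
  cam.start(width, height);  // open the camera at sketch size
  cam.findGlobs(1);          // enable blob ("glob") detection
}

void draw() {
  cam.update();
  // draw the raw camera image
  loadPixels();
  arrayCopy(cam.image(), pixels);
  updatePixels();
  // outline each detected glob centre with a red square
  int[][] centers = cam.globCenters();
  noFill();
  stroke(255, 0, 0);
  for (int i = 0; i < centers.length; i++) {
    rect(centers[i][0] - 10, centers[i][1] - 10, 20, 20);
  }
}

public void stop() {
  cam.stop();  // release the camera
  super.stop();
}
```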

I took out the visualisation of the blobs (well, not all of it yet: keeping some visual reference in an unfinished product is a real time saver while still in production. I just have to comment out a few lines of code when it’s completely finished.)

And I added:

  • mirroring of the video input image;
  • the 2D ribbon code graciously provided by James Alliban, which I linked to the motion tracking output (see the sketch after this list).
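Here is a minimal stand-in for those two additions (emphatically not James Alliban’s actual ribbon code): the mouse plays the role of a tracked blob centre, and the whole drawing is mirrored with the same transform I use for the video image.

```processing
// Mouse = stand-in for a tracked blob centre.
int trail = 40;  // number of stored positions
float[] rx = new float[trail];
float[] ry = new float[trail];

void setup() {
  size(640, 480);
  stroke(255);
  noFill();
}

void draw() {
  background(0);
  // mirroring: flip the x axis, then shift back into view.
  // the mirrored camera image would be drawn inside this same block,
  // e.g. image(cameraFrame, 0, 0) with cameraFrame as a hypothetical name.
  pushMatrix();
  scale(-1, 1);
  translate(-width, 0);
  // a crude "ribbon": a polyline through the last N positions
  for (int i = trail - 1; i > 0; i--) {
    rx[i] = rx[i - 1];
    ry[i] = ry[i - 1];
  }
  rx[0] = mouseX;
  ry[0] = mouseY;
  beginShape();
  for (int i = 0; i < trail; i++) {
    vertex(rx[i], ry[i]);
  }
  endShape();
  popMatrix();
}
```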

Still to implement:

  • mirroring of the motion tracking output;
  • limiting the number of blobs, based on luminance levels (well, in fact on colour); a sketch of both follows below.
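Both items should be small changes to the skeleton above; here is a sketch of how I expect to do them, assuming JMyron’s trackColor and minDensity calls (the tolerance and density values are guesses I will have to tune).

```processing
// Extends the JMyron skeleton above; values are guesses, to be tuned.

// 1) mirror the tracking output so it matches the mirrored video:
int mirroredX(int trackedX) {
  return width - 1 - trackedX;
}

// 2) limit which blobs survive: track only pixels close to white
// (the phone's light) and discard tiny globs.
void configureTracking() {
  cam.trackColor(255, 255, 255, 200);  // r, g, b, tolerance
  cam.minDensity(80);                  // ignore globs under ~80 pixels
}
```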

What I have now works best in a dark setting: writing with particles, using a light source (my mobile phone) as the tracked trigger.

If you browse James Alliban’s blog, you’ll notice that I got my inspiration from his virtual ribbons experiment. I’m only using his 2D ribbon code, but combining it with other chunks of code into a real application forces me to delve deeper into Processing, and will probably prove to be quite a learning experience.

Will be continued…

 
