The Glory

Screenshot from Royal Wood's The Glory
My St Enoch Square Motion Tracking pieces remain one of the biggest projects I’ve worked on, and I’m really pleased that an updated version has now been used in the video for Royal Wood’s The Glory, directed by Adam Makarenko.

You can see the effect in action around the 2:35 mark.

Adam contacted me a few months ago about using the program, and I made some updates to it, adding things like a functioning interface(!) and support for different image sizes(!!). I also refactored the Voronoi/Delaunay code to use ToxicLibs, which made it much more stable, and updating it for Processing 2.0 brought a big increase in speed. I'm hoping to have an opportunity to develop it further at some point: using shaders for some of the image processing should make it faster, and a proper file loader and preset system would make it much more usable.
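For anyone unfamiliar with the term, the core idea behind a Voronoi decomposition is simple: every pixel belongs to whichever seed point it's nearest to. The sketch below is not the ToxicLibs code the piece actually uses (ToxicLibs builds the same structure far more efficiently via Delaunay triangulation); it's just a plain-Java illustration of the underlying idea, with made-up class and method names.

```java
public class VoronoiSketch {
    // Naive Voronoi assignment: label each pixel with the index of its
    // nearest seed point. O(pixels * seeds), so fine for a demo but slow
    // for video frames - hence using a proper library like ToxicLibs.
    static int[] assignCells(int width, int height, int[][] seeds) {
        int[] cells = new int[width * height];
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int best = 0;
                long bestDist = Long.MAX_VALUE;
                for (int i = 0; i < seeds.length; i++) {
                    long dx = x - seeds[i][0], dy = y - seeds[i][1];
                    long d = dx * dx + dy * dy; // squared distance is enough for comparison
                    if (d < bestDist) { bestDist = d; best = i; }
                }
                cells[y * width + x] = best;
            }
        }
        return cells;
    }

    public static void main(String[] args) {
        int[][] seeds = { {0, 0}, {9, 9} };
        int[] cells = assignCells(10, 10, seeds);
        System.out.println(cells[0]);  // pixel (0,0) sits on seed 0
        System.out.println(cells[99]); // pixel (9,9) sits on seed 1
    }
}
```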

Screenshot from Royal Wood's The Glory

It’s really great to see what someone with a bit more artistic vision can do with tools that you’ve made, so thanks to Adam for the opportunity and the great work.

The joys of alpha testing

I know I’ve been away for a while, but you’ll no doubt be glad to know I have a selection of interesting stuff lined up (which I’ve been meaning to write up for ages) as I put together my new portfolio site.

First up, everyone loves a bit of glitch art. I'm a particular fan of GlitchBot myself. These pictures came about as a result of me mucking about with masking a photo using Gaussian distributions. This is broadly the result I was going for:
eye glitch1
Thanks to a strange edge case, in which an alpha version of Processing 2.0, the crappy Intel integrated graphics on my laptop and my not calling background() during the draw() loop all collided, I got stuff like this:
eye glitch2
Continue reading

HD Movement Tracking: further and final iteration

Well, the end of year show has come and gone, and all that remains is the write-up. Here's a quick run-down of the work that I showed and some of the development that went into it. I'll also show the code I wrote (well, cobbled together from other people's code) to do it. If you've not seen it already, you might want to take a look at the first and second posts that show the earlier stages. Done? Onwards!
Continue reading

Shiny: Additive blending with OpenGL in Processing

This sketch was inspired by a combination of things: the particle systems chapter draft from Dan Shiffman's forthcoming Nature of Code book influenced the additive blending aesthetic, while the idea of a three-dimensional "colour space" came from this talk by Mario Klingemann.

All that's really going on here is that the RGB/HSB values of each pixel of an image are mapped to XYZ coordinates while the camera rotates around the centre point. Changing the mode from RGB to HSB creates a different shape from the same collection of pixels, while the low opacity and OpenGL blending create a nice glowing effect. It's interesting to see the connections between shades in an image: almost always a continuous spectrum without large gaps.
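The mapping itself is only a few lines. Here's a minimal plain-Java sketch of the idea (not the actual Processing sketch; the class and method names are mine, and the 0-255 scaling for the HSB axes is an arbitrary choice to match the RGB range):

```java
import java.awt.Color;

public class ColourSpace {
    // In RGB mode the axes are simply the red, green and blue channels
    // of a packed ARGB pixel.
    static float[] rgbPosition(int argb) {
        float r = (argb >> 16) & 0xFF;
        float g = (argb >> 8) & 0xFF;
        float b = argb & 0xFF;
        return new float[] { r, g, b };
    }

    // In HSB mode the same pixel lands somewhere quite different, which is
    // why switching modes reshapes the whole point cloud.
    static float[] hsbPosition(int argb) {
        float[] hsb = Color.RGBtoHSB((argb >> 16) & 0xFF,
                                     (argb >> 8) & 0xFF,
                                     argb & 0xFF, null);
        // Hue/saturation/brightness each come back in 0..1;
        // scale to 0..255 so both modes share the same axis range.
        return new float[] { hsb[0] * 255, hsb[1] * 255, hsb[2] * 255 };
    }

    public static void main(String[] args) {
        int pureRed = 0xFFFF0000;
        System.out.println(java.util.Arrays.toString(rgbPosition(pureRed))); // [255.0, 0.0, 0.0]
        System.out.println(java.util.Arrays.toString(hsbPosition(pureRed))); // [0.0, 255.0, 255.0]
    }
}
```

Pure red sits at one corner of the RGB cube but on the hue-zero edge of the HSB space, which is exactly the kind of relocation that gives the two modes such different silhouettes.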
Continue reading

Sunflow and Processing: the basics

Sunflow is an open-source ray-tracing renderer which can produce some astonishing results in the right hands. Someone far cleverer than me wrote a Java wrapper for it (the catchily titled SunflowAPIAPI), and another wrote a tutorial about getting it talking nicely to Processing, which I relied on heavily to get this working. There is also a Processing library by the same author (the even catchier P5SunflowAPIAPI), but thus far I've not been able to get it to do what I want.

Amnon's post goes into a bit of detail about getting SunflowAPIAPI to read complex geometry from Processing using ToxicLibs. This was my first time using ToxicLibs, but it was relatively straightforward. I wrote a simple class to generate some semi-random geometry using ToxicLibs' TriangleMesh, plus a couple of lines of code that prepare it to be passed to Sunflow. In the main sketch I put all the Sunflow calls (setting up the lights, shaders, camera, etc.) in one function which can be triggered by a keypress. This means the sketch is mostly the same as it would be without Sunflow, and it can use the OpenGL renderer to view the scene before ray tracing: the sketch and the rendering are almost totally separated. I'm not sure if that is possible with the P5SunflowAPIAPI library, or with more complex geometry.

So, to my results…
Continue reading

New Processing Sketches: A Video

Hello, and a somewhat belated happy new year! I hope 2011 has been good to you so far. I've been pretty busy with both official college work and personal projects, and it's the latter I want to show today. I put together a wee compilation of some of the sketches I've made recently as a "showreel" of sorts (with one eye on interviewing for university in the immediate future). Some of these aren't really suitable for web deployment, and doing it as video lets me crank up the detail and quality. It also gives me the opportunity to make some metal to go behind it.

Continue reading

Processing and The Guardian API- now with actual information

2010: a year in Wikileaks

Here's a quick snapshot of how this is developing. This searches the Guardian's Open Platform API for mentions of everyone's favourite whistleblowing website. The bars map the number of articles on a monthly basis, where 12 o'clock is January. You can see a small peak in April, when the Collateral Murder video was released, bigger peaks in July and October as the Afghanistan and Iraq logs were published, and a massive spike in December as "Cablegate" (oh, how I loathe the use of 'gate' as a suffix for anything mildly controversial!) got going. The article headlines are arranged in date order, but on a uniform scale. This is still a work in progress, but I'm quite pleased with how it's shaping up so far.
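The clock-face layout boils down to one mapping: month index to angle. A quick plain-Java sketch of it (the class and method names are mine; it assumes screen-style coordinates where straight up is -PI/2, as in a Processing sketch):

```java
public class MonthClock {
    // Map a month index (0 = January ... 11 = December) to an angle in
    // radians, with January at 12 o'clock and the months advancing
    // clockwise, one twelfth of a full turn each.
    static double monthAngle(int month) {
        return -Math.PI / 2 + month * (2 * Math.PI / 12);
    }

    public static void main(String[] args) {
        System.out.println(monthAngle(0)); // January: -PI/2, straight up
        System.out.println(monthAngle(3)); // April: 0, i.e. 3 o'clock
    }
}
```

Each month's bar is then drawn outward along that angle, with its length proportional to the article count.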

Get in touch with any comments, criticisms or questions!

Processing and the Guardian: Now 73% more object oriented, 300% more colourful

Hello! Just a quick update: following on from my last post I’ve refined the code a bit, letting me run multiple searches from one sketch. Here’s the same three searches from last time, compiled into one image. In this case, Tony is green, Gordon is red and Dave is blue.
Code will be forthcoming once I've refined it a bit more. Adiós!

Processing and the Guardian API

Inspired by this article from the awesome Jer "Blprnt" Thorp, I've been experimenting with the Guardian's Open Platform API, which gives access to ten years' worth of articles in XML or JSON format. You have to sign up for an API key, but it's free and easy. I thought I'd put up some of the early tests I've been doing with it. I've never worked with XML before, so it's been something of a learning experience!
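Talking to the API is mostly a matter of building the right URL and parsing what comes back. Here's a hedged plain-Java sketch of the request side (the host and parameter names match the Open Platform as I found it, but check the current docs before relying on them; "YOUR-KEY" is a placeholder, and the class name is mine):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class GuardianQuery {
    // Build a search URL for the Guardian's Open Platform content search.
    // The query string is URL-encoded so multi-word searches work, and
    // format=json asks for JSON rather than the default XML.
    static String searchUrl(String query, String apiKey)
            throws UnsupportedEncodingException {
        return "http://content.guardianapis.com/search"
                + "?q=" + URLEncoder.encode(query, "UTF-8")
                + "&format=json"
                + "&api-key=" + apiKey;
    }

    public static void main(String[] args) throws Exception {
        // Prints the URL you'd fetch; in a Processing sketch this would be
        // passed to loadJSONObject() or similar.
        System.out.println(searchUrl("tony blair", "YOUR-KEY"));
    }
}
```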
Continue reading

Three Flash Pieces

OK, now it's time for the final instalment of the Nine Words saga that has been ongoing for a while. This time, the brief was to create three interactive pieces using Flash, triggered by words chosen from the nine. My AS3 programming is not very advanced, so I've not been able to get as conceptual as I did with the Processing pieces, but so it goes. All three rely to varying degrees on the rather nice Hype Framework, which simplifies some aspects of AS3 to let you get going a bit more easily. Click on the pictures to play with the pieces.

Continue reading