Code: ProcessingData library (2.0 Data API for 1.5.1)

Sep 5, 2013 | Code, Libraries, Processing / Java

Code: ProcessingData library Source, Exported library (ZIP)

During my CIID workshop I did a quick hack to make the Processing 2.0 Data API available to 1.5.1 users. The code from the 2.0 core needed only a few minor adaptations. The data methods (like loadTable()) that are native to PApplet in 2.0 are provided through a simple helper class.

The data classes are one of the best features of 2.0, cleanly written and consistent as they are. But if you’re sticking with 1.5.1 out of preference or necessity, this might be of use. All glory is owed to the Processing team; I simply repackaged the code. As proof of the quality of the 2.0 code, it took about 30 minutes to extract and refactor the data code. The biggest hurdle was exporting the JAR file and figuring out that it had to be compiled for Java 1.5 compatibility.

The code can be found in my Teaching repo on GitHub, and the exported library can be downloaded as a ZIP file. See the included LoadSaveTable.pde example from Shiffman for a demo; it shows how to use the ProcessingData class to call loadTable() etc.

Disclaimer: Minimal testing was performed, any issues should be reported on GitHub.


Blog theme update

Sep 4, 2013 | News, Web dev

In anticipation of my upcoming ITP activities I have given the blog a much-needed theme update. My chosen poison is the Miniml theme by Leland Fiegel, hacked and dismembered at will to suit my purposes.

Updates are prone to breaking things, so any feedback and error reports will be appreciated. Really.


Teaching: Parametric design + digital fabrication at NYU ITP, Fall 2013

Aug 30, 2013 | News, Processing / Java, Workshops

This fall semester I will be teaching a 14-week class on Parametric Design and Digital Fabrication at NYU ITP, focusing on generative strategies for the creation of fabbed artifacts and structures. The word “design” in the class description is intended in an agnostic sense, referring to a process of creation rather than a specific creative canon or market economics.

ITP is well-known as a vital breeding ground for media artists and interaction designers, with a great student base and an even more stellar cast of faculty members. It is also the home base of the inimitable Dan Shiffman, thus playing a role in supporting the development of Processing 2.0.

In addition to teaching the class I will also be an ITP resident, which will allow me to do research and get involved with other ITP projects. I’m excited that this is officially happening, as previous plans had to be aborted due to my incomplete immigration status.

Class info + Github repo



Code: CircleLerp.pde (++ thoughts on math)

Aug 18, 2013 | Code, Processing / Java, Workshops


CircleLerp.pde

Aug 22, 2013 Update: Dave Bollinger and Frederik Vanhoutte responded to my post with code improvements and alternate solutions. I’ve already updated my code as suggested, but I also recommend reading this old thread on the Processing Forum. In it, Dave proposes a short and sweet wrapLerp() function that can deal with any situation where wrap-around interpolation might be required:

float wrapLerp(float a, float b, float t, float w) {
  // Shift a by a full wrap if the direct distance exceeds half the range
  a += (abs(b-a) > w/2f) ? ((a < b) ? w : -w) : 0;
  return lerp(a, b, t);
}

Where "w" is the range of the given interval to be wrapped, i.e. w=1 for a normalized interval and w=TWO_PI for a full circle given in radians.
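For anyone wanting to try this outside the Processing IDE, here is a self-contained plain-Java sketch of the same function (lerp() is reimplemented below since it is normally a PApplet method). One caveat worth noting: the result can land slightly outside [0, w) and may need a final modulo if you require values inside the range.

```java
public class WrapLerpDemo {
    // Plain-Java stand-in for Processing's lerp()
    static float lerp(float a, float b, float t) {
        return a + t * (b - a);
    }

    // Dave Bollinger's wrapLerp(), as quoted above
    static float wrapLerp(float a, float b, float t, float w) {
        // Shift a by a full wrap if the direct distance exceeds half the range
        a += (Math.abs(b - a) > w / 2f) ? ((a < b) ? w : -w) : 0;
        return lerp(a, b, t);
    }

    public static void main(String[] args) {
        // Interpolating 0.9 -> 0.1 on a normalized interval goes through
        // the wrap point at 0.0/1.0 rather than through 0.5:
        System.out.println(wrapLerp(0.9f, 0.1f, 0.50f, 1f)); // ~0.0 (the wrap point)
        System.out.println(wrapLerp(0.9f, 0.1f, 0.25f, 1f)); // ~-0.05, i.e. 0.95 mod 1
        System.out.println(wrapLerp(0.2f, 0.4f, 0.50f, 1f)); // ~0.3, no wrap needed
    }
}
```

The same call works for angles by passing w=TWO_PI, exactly as described above.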



You Are Big Data: CIID Summer School

Jul 15, 2013 | Open source, Processing / Java, Theory, Workshops


Screenshot: Manic Time, a particularly obsessive time tracking app

The following is a summary of tools and resources for my two week “You Are Big Data” workshop for CIID Summer School in Copenhagen, in which we’ll be dealing with Quantified Self and data sculpture. This is in part a repost of a previous list.

Andy Polaine wrote a post that referenced my previous summary, in which he made some good critical points and provided a link to a tool I was unaware of: Slogger by Brett Terpstra (sadly, I don’t have a MacOS / Linux setup for testing these kinds of apps myself.)



Workshop NYC, June 30: Processing.js and JS for Processing users

Jun 19, 2013 | Code, News, Processing / Java, Workshops

Workshop: Processing.js and JS for Processing users
Sun June 30th, Park Slope, NYC

In this workshop participants will learn how to apply the power of Processing to web environments with Processing.js (PJS). An ingenious port of the core Processing API to Javascript and HTML5, PJS is the kind of tool that would have been all but unimaginable just a few years ago.

Possible applications include code-based animation, interactive visuals and data visualization, presented as web-native media experiences viewable by a mass audience online and on mobile devices. Our focus will be on creating generative visuals in PJS, aided in part by my new Modelbuilder.js, developed for just that purpose and already a valuable tool for my own JS projects.

If you’re curious about Processing.js or would like to apply your Processing skills to the creation of code-based content for the web, this workshop will give you a head-start by providing you with practical real-world techniques, a few flashy effects and the foundation for long-term survival skills.

A full workshop breakdown can be found below, followed by practical information. The workshop breakdown is somewhat on the verbose side, as it provided me with a way to think out loud while planning the workshop. As a non-JS native, I’m still figuring out some of the conceptual implications of the shift from Java to JS.

But rest assured, my workshops are about making things, not pondering the finer points of Computer Science theory.

Suitable for: Processing coders of all levels. Some knowledge of HTML, CSS and Javascript will be helpful but not required. As preparation I would suggest reading the Quick Start – Processing Developer and Quick Start – JavaScript Developer guides.



Workshop NYC, June 29: Advanced – Geometry and Animation in Processing

Jun 15, 2013 | Code, Libraries, Processing / Java, Workshops

From the Catenary Madness series (created with Toxiclibs, see code on OpenProcessing)

Workshop: Advanced Processing – Geometry and animation
Sat June 29th, Park Slope, NYC

Processing is a great tool for producing complex and compelling visuals, but computational geometry can be a challenge for many coders because of its unfamiliar logic and reliance on mathematics. In this workshop we’ll break down some of the underlying principles, making them more comprehensible and showing that we can create amazing output while relying on a set of relatively simple techniques.

Participants will learn advanced strategies for creating generative visuals and motion in 2D/3D, including how to describe particle systems and generate 3D mesh geometry, as well as useful techniques for code-based animation and kinetic behaviors. We will use the power of libraries like Modelbuilder and Toxiclibs, not just as convenient workhorses but as providers of useful conceptual approaches.

The workshop will culminate in the step-by-step recreation of the Catenary Madness piece shown above, featuring a dynamic mesh animated by physics simulation and shaded with vertex-by-vertex coloring. For that demo we’ll be integrating Modelbuilder and Toxiclibs to get the best of both worlds.

Suitable for: Intermediate to advanced. Participants should be familiar with Processing or have previous coding experience allowing them to understand the syntax. Creating geometry means relying on vectors and simple trigonometry as building blocks, so some math is unavoidable. I recommend that participants prepare by going through Shiffman’s excellent Nature of Code chapter on vectors and Ira Greenberg’s tutorial on trig.

Practical information

Venue + workshop details: My apartment in Park Slope, Brooklyn. Workshops run from 10am to 5pm, with a 1 hour break for lunch (not included). Workshops have a maximum of 6 participants, keeping them nice and intimate.

Price: $180 for artists and freelancers, $250 for design professionals and institutionally affiliated academics. Students (incl. recent graduates) and repeat visitors enjoy a $30 discount. The price scale works by the honor system and there is no need to justify your decision.

Basically, if you’re looking to gainfully apply the material I teach in the commercial world or enjoy a level of financial stability not shared by independent artists like myself, please consider paying the higher price. In doing so you are supporting the basic research that is a large part of my practice, producing knowledge and tools I invariably share by teaching and publishing code. It’s still reasonable compared to most commercial training, plus you might just get your workplace to pay the bill.

Booking: To book a spot on a workshop please email with your name, address and cell phone # as well as the name of the workshop you’re interested in. If you’re able to pay the higher price level please indicate that in your email. You will be sent a PayPal URL where you can complete your payment.

Attendance is confirmed once payment is received. Keep in mind that there is a limited number of seats in each workshop.


Processing 2.0 released (links + notes)

Jun 6, 2013 | Code, Libraries, Processing / Java

Processing 2.0 is out of beta; the release version dropped last night and can be downloaded from the Processing website (with the option to donate to the Processing Foundation to help further development). Congratulations are in order to the whole team behind the new version!


Personally, I look forward to an end to the proliferation of library incompatibilities, which has become a bit of a problem since developers started migrating to the 2.0b code base. I’m still using the Processing 1.5.1 code base with Eclipse for current projects, having no immediate need for shaders etc. I use Processing as a professional production tool, so it’s mission critical for me to have a stable and predictable workflow.

That can be a surprisingly tricky proposition once you add dependencies on various libraries into the mix. Keeping track of which library releases play well together is a must. (Pro tip: Make sure you note which libraries and which versions are used for a given project, preferably bundling backups with project code for future compatibility.) Fortunately, code for various Processing versions can be downloaded (using tags) on the Processing GitHub repo.

I’m curious to see if the final versions of PShapeOpenGL etc. can resolve any of the issues I brought up recently about geometry data. A quick look reveals that PShape and PGraphics still don’t support PVector as a basic data unit for calls like vertex() etc., which gives developers little real incentive to adopt PVector. I would have loved to see vertex() expanded to accept PVectors, or even better, PVector arrays and ArrayLists. But I guess not doing so keeps the core code as clean as possible.

Access to the internal data of PShape and the Processing rendering engine is still limited, perhaps not surprising given that their internal optimized representations can be somewhat arcane. I recently “discovered” the useful PStyle class (along with pushStyle()/popStyle()); I don’t know why I never noticed it before. Classes describing gradient drawing styles can be found in PShapeSVG, but they’re clearly not intended for public use.

Final analysis: I’m hopeful about PShapeOpenGL and I’m sure that using it will offer performance optimization (for static models, especially.) But I’m curious whether users will bother with it unless it is adopted by Toxiclibs and other libraries commonly used to generate the kind of complex geometry that would benefit from it.

I will take a shot at writing a translation mechanism between Modelbuilder’s UGeometry class and PShapeOpenGL; that should be a good test of how well it lends itself to real-world use.


Thoughts on PVector and data exchange between Processing geometry libraries

May 19, 2013 | Code, Open source, Processing / Java

Update: I just closed the GitHub thread; apologies for wasting anyone’s time. Given the mixed reactions on GitHub I have decided to leave the issue for greater minds to ponder.

I still think that standardization of low-level geometry data would be a good thing, promoting interoperability between libraries. More importantly, it would provide a familiar class framework for users, lessening the confusion slash tedium of learning new data structures for every new library (not to mention writing boilerplate code to convert data passed between libraries).

Let’s face it, computational geometry is hard. But it’s also an essential element of generative systems and computational design. Anything that helps users ease into the world of vectors, vertices and meshes is surely a good thing.


I just posted a thread for discussion on the Processing Github repo:
Thoughts on PVector and data exchange between Processing geometry libraries.

For a while now I’ve been frustrated by the lack of universal and portable data structures for vector and geometry data in Processing. Using PVector sounds nice in theory, but the reality is that geometry libraries like Hemesh, Toxiclibs, Geomerative and Modelbuilder all rely on their own custom representations of vector and mesh data. The result is incompatible data structures with few if any methods for universal data exchange. Worse, PVector is largely ignored and almost never seen “in the wild”.

There are many good reasons for this proliferation of incompatible code, most significantly developer preference and the overall focus and internal logic of any given library. But having had the occasional pleasure of integrating both Toxiclibs and Hemesh with my own Modelbuilder, it strikes me that a standard interface would benefit both developers and users. Translating UVec3 objects to Toxiclibs’ Vec3D isn’t all that difficult, but it is tedious. And going beyond simple passing of vector data to more ambitious structures like meshes with per-vertex shading is a headache.

My proposal would be a minimal Java interface (called PVertex, perhaps) representing vertex data (x, y, z + ARGB color + UV texture coordinates). Custom vector classes would implement this interface, guaranteeing interoperability but leaving developers free to choose any further implementation details. An interface should be minimally intrusive, but would be very helpful in encouraging geometry data exchange. (In an ideal world I’d also love to see a minimal mesh container interface (PMesh?) represented by a PVertex array plus an integer array of vertex ID triplets. But, hey, I’m a dreamer.)
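To make the idea concrete, here is a rough plain-Java sketch of what such an interface might look like. The names and accessor signatures (PVertex, x(), argb() etc.) are my own hypothetical reading of the proposal, not an actual Processing API:

```java
// Hypothetical minimal vertex interface: position + packed ARGB color
// + UV texture coordinates, exactly the data set described above.
interface PVertex {
    float x(); float y(); float z(); // position
    int argb();                      // packed ARGB color
    float u(); float v();            // texture coordinates
}

// A library's vector class could implement the interface without
// changing its internal representation; a trivial example:
class SimpleVertex implements PVertex {
    final float x, y, z, u, v;
    final int argb;
    SimpleVertex(float x, float y, float z, int argb, float u, float v) {
        this.x = x; this.y = y; this.z = z;
        this.argb = argb; this.u = u; this.v = v;
    }
    public float x() { return x; }
    public float y() { return y; }
    public float z() { return z; }
    public int argb() { return argb; }
    public float u() { return u; }
    public float v() { return v; }
}

public class PVertexSketch {
    public static void main(String[] args) {
        // Consumers deal only in PVertex, never in concrete classes
        PVertex p = new SimpleVertex(1f, 2f, 3f, 0xFFFF0000, 0f, 1f);
        System.out.println(p.x() + "," + p.y() + "," + p.z()
            + " color=" + Integer.toHexString(p.argb()));
    }
}
```

The point is that any code consuming geometry could accept PVertex and never care which library produced the data.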

If any of this piques your interest, I suggest you go follow (and maybe participate in) the GitHub thread. Karsten Schmidt and Frederik Vanhoutte have already given some valuable feedback. Ultimately, this isn’t so much a discussion about the Processing core as it is an attempt to get library coders to agree on some minimal conventions.


Code: Catenary Madness

Apr 25, 2013 | Code, Flickr, Open source, Processing / Java, Watz work

OpenProcessing - catenary_mwTweak03 4-up 02

Marius Watz: Catenary Madness (Flickr sketches / code on OpenProcessing)

In case you noticed last week’s Flickr flood of fairly wild-looking sketches titled “Catenary Madness”: I just got around to posting that code on OpenProcessing, and it even has a few bells and whistles on it: catenary_mwTweak03.

The origin of this series was very random, the original impetus being a very nice catenary curve sketch by Dominik Strzelec. Dominik used Toxiclibs to model a surface of interconnected springs (a piece of billowing fabric, essentially), with gravity and relaxation constraints making things interesting. As set up, the simulation reproduced the famous catenary curve effect (although with an actual surface rather than just chains).

The catenary curve is a geometrical gem famously simulated and used as a form-generating process by the very pre-digital architect Gaudi:

Wikipedia, Catenary: In physics and geometry, the catenary is the curve that an idealized hanging chain or cable assumes under its own weight when supported only at its ends.

The curve has a U-like shape, superficially similar in appearance to a parabola (though mathematically quite different). It also appears in the design of certain types of arches and as a cross section of the catenoid—the shape assumed by a soap film bounded by two parallel circular rings.
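For reference, the idealized curve has a simple closed form, y = a*cosh(x/a), which makes a nice sanity check against the physics simulation. A quick plain-Java evaluation (the parameter a here is arbitrary, not taken from Dominik's sketch):

```java
public class Catenary {
    // The idealized catenary: y = a * cosh(x / a), where a sets the
    // scale of the curve and the lowest point sits at height a
    static double catenaryY(double x, double a) {
        return a * Math.cosh(x / a);
    }

    public static void main(String[] args) {
        // At x = 0, cosh(0) = 1, so the chain's lowest point is at y = a
        System.out.println(catenaryY(0.0, 2.0)); // prints 2.0
        // The curve rises symmetrically on either side
        System.out.println(catenaryY(-1.0, 2.0) == catenaryY(1.0, 2.0)); // prints true
    }
}
```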

I rarely if ever use “proper” simulation of physics, chemical processes or artificial intelligence in my work, but I was intrigued by the organic mesh structure produced by the simulation. Downloading Dominik’s code and predictably scaling up the complexity of the mesh and forces involved quickly made me realize that I have been missing out (although I do remain skeptical of algorithms that can be easily reproduced). The tipping point came when I applied some per-vertex color shading haxxoring; the graphic craziness that ensued had me lost for hours…

The posted code is just one of many sketches exploring the system. I did end up converting the mesh and color logic to Modelbuilder since I’m less familiar with Toxiclibs, but I chose to post a version using Toxiclibs since it’s closer to the original code. The posted example has very random colors, producing palettes that I would consider unacceptable 80% of the time. The principle of mapping a 1D color list to a 2D color mesh was the focus of interest, rather than the actual colors themselves.

For a more complete index of the Catenary Madness sketches, see Flickr:

Catenary Madness Index-1500x1500
