For a week in April, Auckland artist Simon Ingram will be present in the Gallery. During this time, his brain activity will be monitored by a consumer-grade electroencephalogram (EEG) headset, interpreted by custom software, and materialised in paintings made in situ by a machine of his own design, which renders painterly 'expressive' marks.
Neuro-Action-Painting combines new technology (EEG, robotics) with traditional painting language and tools (canvas, oil paint, brush). It is the latest chapter in Simon Ingram's longstanding inquiry into how ideas from radio astronomy, computer science, artificial intelligence, physics, and engineering can be used to develop painting machines that make invisible energies visible.
Ingram’s works have typically bypassed the artist. For instance, some were determined using radio waves. But recently Ingram had a change of heart. He says: ‘I found that I wanted to paint again, to insinuate myself into the framework I had sought to be excluded from. The EEG provided a way to return to the human subject and “interiority” that I had sought to exclude in my Radio Paintings, but by similar means.’
Ingram’s EEG data will generate a line that wanders across the domain of the canvas. His beta waves will determine the length of the painted lines; his alpha waves, whether the line turns left or right. The machine is programmed to avoid the path it has previously traced, 'tunnelling under' or glancing off any existing line it encounters. Complicating the experiment, Ingram will also be able to intervene consciously, drawing directly into the application’s virtual domain even as his brain’s electrical impulses are being received and interpreted by it. The finished paintings, then, may scramble drawing by hand (with agency) and drawing by brain (without).
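The mapping described above can be sketched in code, purely as an illustration: the turn angle, the beta-to-length scaling, and the deflect-on-collision rule below are all hypothetical parameters, since the actual values in Ingram's software are not published.

```python
import math

def trace_line(eeg_samples, start=(0.0, 0.0), heading=0.0, step=1.0):
    """Simulate the wandering line from (alpha, beta) EEG sample pairs.

    Beta power sets segment length; alpha sign sets turn direction.
    The 30-degree turn, the x5 length scaling, and the 60-degree
    deflection on collision are invented for this sketch.
    """
    x, y = start
    visited = {(round(x), round(y))}   # cells the line has already crossed
    path = [(x, y)]
    for alpha, beta in eeg_samples:
        # alpha decides left vs right turn
        heading += math.radians(30) if alpha >= 0 else -math.radians(30)
        # beta scales the length of the next painted segment
        length = max(1, int(beta * 5))
        for _ in range(length):
            nx = x + step * math.cos(heading)
            ny = y + step * math.sin(heading)
            cell = (round(nx), round(ny))
            if cell in visited:
                # glance off an existing line: deflect and retry once
                heading += math.radians(60)
                nx = x + step * math.cos(heading)
                ny = y + step * math.sin(heading)
                cell = (round(nx), round(ny))
            visited.add(cell)
            x, y = nx, ny
            path.append((x, y))
    return path
```

Feeding the function a stream of sample pairs yields a polyline that a plotting machine could paint; a stronger beta reading produces a longer run before the next turn.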