Neural style transfer in Processing

Here are some pointers for getting started with Processing and neural style transfer (I am not an expert):

perspective

An interesting overview from Hannu Töyrylä on moving from Processing to neural style transfer, and then re-incorporating it into a Processing workflow.

http://liipetti.net/erratic/2016/12/11/neural-networks-style-transfer-and-artistic-process/

…more related posts:

http://liipetti.net/erratic/category/neural-networks/

try it out

If you are looking to try out a working sketch in Processing, Bryan Chung of Magic & Love has a blog post on neural style transfer with results from a Processing (Java) sketch that applies pre-trained Torch models to live video.

The Processing sketch is here:

I believe Bryan gets the models from Johnson et al.'s “fast-neural-style” feed-forward style transfer system. That code (and a model download script) is here:

…specifically, I think the sketch needs the resource “composition_vii.t7”, which is available here:
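
I have not dug into how Bryan's sketch loads the model, but to give a feel for what applying one of those .t7 files looks like from Java, here is a minimal sketch using OpenCV's dnn module. This is not Bryan's code: it assumes a recent OpenCV (4.x) Java build is on the sketch's classpath with its native library findable, and that composition_vii.t7 and a content.jpg sit in the sketch's data folder.

```java
import java.util.*;
import org.opencv.core.*;
import org.opencv.dnn.*;
import org.opencv.imgcodecs.Imgcodecs;

PImage result;

void setup() {
  size(640, 480);
  // Load OpenCV's native library (how it is found depends on your install).
  System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

  // Read the pre-trained fast-neural-style Torch model.
  Net net = Dnn.readNetFromTorch(dataPath("composition_vii.t7"));

  // Read the content image with OpenCV (kept in BGR order).
  Mat img = Imgcodecs.imread(dataPath("content.jpg"));

  // These models expect a mean-subtracted BGR blob.
  Scalar mean = new Scalar(103.939, 116.779, 123.68);
  Mat blob = Dnn.blobFromImage(img, 1.0, new Size(img.cols(), img.rows()),
                               mean, false, false);
  net.setInput(blob);
  Mat out = net.forward();

  // Turn the 1x3xHxW output blob back into an image and add the mean back.
  List<Mat> outputs = new ArrayList<Mat>();
  Dnn.imagesFromBlob(out, outputs);
  Mat styled = outputs.get(0);
  Core.add(styled, mean, styled);
  styled.convertTo(styled, CvType.CV_8UC3);

  // Save the result and load it back as a PImage for display.
  Imgcodecs.imwrite(dataPath("styled.png"), styled);
  result = loadImage(dataPath("styled.png"));
}

void draw() {
  if (result != null) image(result, 0, 0, width, height);
}
```

The mean values (103.939, 116.779, 123.68) are the usual VGG/Caffe BGR means that these models expect to have subtracted on the way in and added back on the way out. For live video you would move the forward pass out of setup() and into draw() or a separate thread.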

implement a full system

If you are interested in doing it end-to-end in Processing (Java), one way might be to incorporate dl4j. They have a neural style transfer example in their examples repository, and you could try including the library and the example class in a Processing sketch and then calling it.
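
I have not tried this myself, but the wiring inside a sketch might look roughly like this. It assumes you have added the dl4j jars to the sketch and adapted the dl4j example class so the work is exposed as a static method; the NeuralStyleTransfer.transfer(content, style, out) call below is hypothetical, since the stock example runs everything from main() with hard-coded paths.

```java
PImage styled;
boolean done = false;

void setup() {
  size(600, 600);
  // Style transfer is slow, so run it off the animation thread.
  thread("runTransfer");
}

void runTransfer() {
  // NeuralStyleTransfer.transfer(...) is a hypothetical static method:
  // it stands in for the dl4j example class after you have adapted it
  // to take file paths instead of running from main().
  NeuralStyleTransfer.transfer(dataPath("content.jpg"),
                               dataPath("style.jpg"),
                               dataPath("styled.png"));
  styled = loadImage(dataPath("styled.png"));
  done = true;
}

void draw() {
  background(0);
  if (done && styled != null) {
    image(styled, 0, 0, width, height);
  } else {
    text("transferring style...", 20, 20);
  }
}
```

The thread() call matters because the transfer can take a long time; running it on the animation thread would freeze the window.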

There is a related dl4j Java tutorial here:

https://dzone.com/articles/java-art-generation-with-neural-style-transfer

Note that another route is using TensorFlow.

do it with python

If you are thinking about Python rather than Java, there is an introductory walkthrough here, and a great step-by-step tutorial here:

…so you could work through the example in Python 3 and then use p5py to bring it into a Processing-like API.

Note that this requires p5py: the examples are written in Python 3 and, more importantly, use PyTorch, which I believe is not compatible with Processing.py (Python mode) even in PyTorch's older Python 2.7-compatible versions, because PyTorch relies on CPython while Processing.py runs on Jython.
