Processing PULSE (Collaboration Project)

I have developed an idea called Processing PULSE.

Essentially it will be a collection of ideas and sketches by developers who want to collaborate using Processing.

Anyone who may be interested can message me here or email me at
karl@adaptinnovative.com
kaotisian@gmail.com

or whatsapp me at +1 (246) 267-1826


Like what ideas?

In what area? 3D…?

Hi @Chrisir ,

I am a fan of both 2D and 3D. PULSE will be a project where data drives both data visualization and PULSE itself.

The best way I can describe PULSE is Processed Universal Language Symmetrical Emulation.

Essentially, we all deal with data as binary: zeros and ones. In PULSE I store data in a myriad of ways:

  • pixels
  • audio
  • animation
  • even light

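PULSE's actual storage formats are not published, but the "data as pixels" idea can be illustrated with a minimal sketch: packing data bytes into the channels of an ARGB pixel, the same integer format Processing uses in its `pixels[]` array. The encoding scheme below is purely hypothetical.

```java
public class PixelEncode {
    // Pack three data bytes into one opaque ARGB pixel.
    // (Hypothetical scheme for illustration; not PULSE's real encoding.)
    static int encode(int r, int g, int b) {
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }

    // Recover the three bytes from the pixel's color channels.
    static int[] decode(int pixel) {
        return new int[] { (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF };
    }

    public static void main(String[] args) {
        int px = encode(12, 34, 56);
        int[] back = decode(px);
        System.out.println(back[0] + "," + back[1] + "," + back[2]);
    }
}
```

In a Processing sketch, such pixels could be written into a `PImage` and read back later, so an ordinary image doubles as a data store.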
So currently I use it to power my 3D and 2D sketches in Processing, and now I am moving towards building data models for directing movement and simulating choice-based actions in software. I wanted to see if people here on forum.processing.org wanted to collaborate on the development side of the project, as well as on the content and applications (sketches, demos, apps, etc.).

If you need examples, let me know and I can post some sketches here or on openprocessing.org to demonstrate PULSE in action.


[image: processingpulse]


Hi @adaptinnovative – can you explain a bit more what Pulse is?

  • Is it a web host, like openprocessing?
  • Is it a library, like PixelFlow?
  • Is it a library / framework, like Hype?
  • Is it a protocol, like OSC?

Hi @jeremydouglass, I will enclose a screenshot of PULSE in coding.
You can see two (2) windows below.
The first window, Roll, captures the pulse data as it moves across the x-axis using sin, cos, or tan.

The second window shows how the data is used to manipulate images, sound, 3D textures, etc. Once it's on screen, the pulse data can manipulate it. So it's more of a framework like Hype, but encapsulated as either a localized machine protocol or behind an HTTP server for remote access.
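The "roll" described above sounds like a sampled oscillator whose output is remapped to drive some on-screen parameter. Here is a minimal plain-Java sketch of that idea; the function name `pulse` and the mapping are assumptions for illustration, not PULSE's actual API.

```java
public class Roll {
    // One step of a hypothetical "roll": sample a sine oscillator at phase x
    // and remap its output from [-1, 1] to [min, max], e.g. a screen coordinate.
    static double pulse(double x, double min, double max) {
        double s = Math.sin(x);                      // oscillator value in [-1, 1]
        return min + (s + 1.0) / 2.0 * (max - min);  // linear remap to [min, max]
    }

    public static void main(String[] args) {
        // Drive a value across a 0..400 range as the phase advances along the x-axis.
        for (double x = 0; x < Math.PI * 2; x += Math.PI / 2) {
            System.out.printf("x=%.2f -> %.1f%n", x, pulse(x, 0, 400));
        }
    }
}
```

Swapping `Math.sin` for `Math.cos` or `Math.tan` gives the other waveforms mentioned; in a Processing sketch the same remap is usually done with the built-in `map()` function.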


Very interesting. It sounds like the “roll” is a core part of the pipeline, and the screen / canvas / media manipulation is downstream from that. So are the roll contents created by e.g. oscillators or sound files / audio inputs? Is it a signal processing framework for things like video synthesis, something like Hydra?

Yes @jeremydouglass, you have the gist of the idea. However, I am taking it a bit further and using it to synthesize anything on screen: trees, landscapes, shadows, dust, fog, etc., all powered by a singular architecture called PULSE. Here is a YouTube video of it in action. The second window is pure pixel code, and it uses one pixel PULSE file to control the location of the mouse.


I am also looking at using Google's Flutter to develop cross-platform app(s) bridging Processing and Flutter.

Interesting. Is that using p5.dart? https://pub.dev/packages/p5


Thank you sooo much for that link.

Yes, I plan to use it as a bridge between APDE, Processing, and Flutter.

Are you interested in contributing to the project? @jeremydouglass

Glad it was helpful!

Unfortunately I have no experience with the Dart programming language.

However if you post a public open source repo for the project for contributors then I would be happy to take a look.

Hi @jeremydouglass

Me neither…

That's the point of the project: you code in Processing, and Processing PULSE provides the framework, so you can keep coding in Processing. You don't need the Dart language to assist on the Processing side.

Processing knowledge is a benefit, but you can contribute in any language, or just with ideas.