GSoC 2026: Join the Processing Foundation as a Summer of Code Contributor!

Hello! Thanks to everyone who has contributed to this thread so far!

We’ve updated the Processing GSoC FAQ to reflect the conversations here and to include key 2026 info and links.

We are currently working on adding per-project sections; you will already find an E2E Testing for p5.js FAQ there.

Other projects will be adding to the FAQ soon per discussion on this thread. Please continue to add questions for individual projects in this thread, so that information shared is accessible to all applicants.

3 Likes

I think it is mentioned in the template that @kit posted, which specifies how many pages each section should be. You can also refer to GSoC 2026: Join the Processing Foundation as a Summer of Code Contributor! - #123

2 Likes

Hi everyone :sunflower:

Just a brief reminder: Kam (2025 GSoC contributor) and Rachel (p5.js Editor project lead) will be online on March 20th (Friday) at 14:00 CET on the p5.js Discord #voice to talk a bit about their project - the autocomplete hinter - and share their experiences. Invite here

Kam has also shared their successful proposal. Last year, we did not have a template. Even with this year’s template, we expect people to have different voices and styles. However, I hope Kam’s proposal can demonstrate how to use screenshots and links very effectively. It’s technically detailed, clear, and well-aligned with community and project values.

Thanks to @clairep94 for updating the FAQ! I have linked it from other places, too.

@harshil: I confirm that your feedback request has been received! We try to give feedback as soon as possible, but it might take longer depending on which project it is. The absolute latest is March 25th, but since you submitted so early (thank you!) it should be in the next day or two. (EDIT: I have emailed feedback to you / all requests made on March 15th.)

This form will be open from March 16 until March 22 (inclusive). You will receive an answer by end of day March 25th at the latest; earlier if you submit earlier, or if your request is very short and focused (1-2 pages). GSoC’s final submission deadline is March 31st.

Best,
Kit

2 Likes

Thank you @harshil for replying, I’ll check the linked post.

1 Like

Hi everyone!

I’ve been reflecting more on where I can make the most meaningful contribution this summer, and after spending time exploring several project areas, I find myself most drawn to the Code Translation between Processing Sound and p5.sound.js idea.

A bit of context on why the shift: I initially came in thinking about the e2e testing direction given my TypeScript background, but the more I read through the sound idea — and the more time I spent actually running sketches with both p5.sound.js and Processing Sound — the more I realized this is the project where my full-stack skills and genuine curiosity about the problem intersect most naturally. Building a tool that lets a Processing Sound sketch travel to the web feels like the kind of thing I’d want to exist regardless of GSoC.

I’ve started doing the groundwork — mapping the API surface between the two libraries method by method, looking at where the translation is straightforward (e.g. SoundFile → p5.SoundFile is mostly a namespace difference) and where it genuinely diverges (e.g. Processing Sound’s SinOsc.play(freq, amp) vs p5.sound.js splitting frequency and amplitude across separate calls on the oscillator). That structural mapping is going to be the foundation of the proposal’s technical section.
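To make that mapping concrete, here is a rough sketch of how such a rule table might look as data. The rule shapes, the translate() helper, and the emitted variable name osc are all hypothetical; only the API names (SoundFile, p5.SoundFile, SinOsc.play, and the freq/amp/start calls on the oscillator) come from the libraries' public docs.

```javascript
// Hypothetical rule table mapping Processing Sound calls to p5.sound.js.
// The rule shapes and the translate() helper are illustrative only.
const rules = {
  // Mostly a namespace difference: same concept, new constructor name.
  "SoundFile": { target: "p5.SoundFile", kind: "rename" },
  // Genuine divergence: Processing Sound's SinOsc.play(freq, amp) becomes
  // separate freq()/amp()/start() calls on the oscillator.
  "SinOsc.play": {
    kind: "restructure",
    emit: (freq, amp) =>
      [`osc.freq(${freq});`, `osc.amp(${amp});`, `osc.start();`].join("\n"),
    note: "p5.sound.js splits frequency and amplitude across separate calls.",
  },
};

// Translate one call; returns emitted code plus an optional annotation
// that a warning/annotation layer could surface to the user.
function translate(name, args) {
  const rule = rules[name];
  if (!rule) return { code: null, note: `No mapping for ${name} yet.` };
  if (rule.kind === "rename") return { code: rule.target, note: null };
  return { code: rule.emit(...args), note: rule.note };
}
```

The nice side effect of keeping the mapping as data rather than hard-coded logic is that the same table can drive both the code emitter and the explanatory annotations.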

For the implementation, I’m thinking a React-based web tool with a bidirectional code editor (Processing Sound ↔ p5.sound.js), a live p5.js preview pane, and a warning/annotation layer for patterns that have no clean equivalent. The “especially excited about creative approaches” line in the idea description has me thinking about whether the tool could do more than translate — maybe surface the conceptual difference with an explanation, not just emit code.

A question for @kit (or Kevin Stadler if he’s around): in the idea description’s note about being excited by approaches that go beyond literal code translation — is there a specific educator or artist workflow pain point behind that phrasing? I want to make sure the proposal addresses a real gap, not just what seems elegant technically.

Thanks for keeping this thread so well-organized — it’s been genuinely useful.

Best,
Samarth

2 Likes

I had some questions about building out video and sound libraries for L5 (one of the ideas from the Project Ideas List).

The questions were about how the L5sound and L5video libraries might be structured, since we are trying to build with minimal dependencies. I do not yet know the exact structure, as research is needed, but I will describe what I think is possible.

L5 is the core library with its own repository. L5sound and L5video would each be their own separate libraries, just as p5.sound is a separate, optional addon library used alongside p5.js. The exact structure of the L5sound and L5video libraries would be developed during the GSoC period. Here’s some brainstorming on potential structure:

For L5sound, the Luafft library (if we use that) would likely lead to a structure something like:

L5sound/
  ├── L5sound.lua
  ├── vendor/
  │   └── luafft/
  ├── examples/
  └── README.md

But ideally we could perhaps inline luafft so that the end result is a single drop-in file, and a user would only need to do:

require("L5")
require("L5sound")

For L5video, we will likely have binary dependencies such as webcam.so or webcam.dll, so the best we may be able to do is a structure like:

L5video/
  ├── L5video.lua (single Lua file)
  └── lib/
      ├── webcam.so
      ├── webcam.dll
      └── webcam.dylib 

and then provide a L5video.zip with everything bundled.

As an FYI, FFI is built into Love2d, our underlying framework, and allows calling C functions from .so/.dll/.dylib files. Since we’ll need a webcam library as compiled C, we would use FFI to call it from our Lua code.

Inside L5video.lua we could detect platform and load the appropriate library:

-- L5video.lua
local ffi = require("ffi")

-- Detect platform and load appropriate library
local platform = love.system.getOS()
local webcam_lib

if platform == "Windows" then
  webcam_lib = ffi.load("lib/webcam.dll")
elseif platform == "Linux" then
  webcam_lib = ffi.load("lib/webcam.so")
elseif platform == "OS X" then
  webcam_lib = ffi.load("lib/webcam.dylib")
else
  error("L5video: unsupported platform: " .. tostring(platform))
end

An L5 coder wanting to use the L5video library might then import it with something like:

require("L5")
require("L5video")

This may all change, or some of this information may turn out to be incorrect later, but I wanted to respond to the questions with as clear info as I can at this point. Thanks.

3 Likes

Hi everyone,
I’ve been working on the eyedropper debugging proposal and would love early feedback from anyone who debugs shaders with p5.strands - specifically, what information would be most useful to display alongside the RGBA values? Pixel coordinates, hex equivalent, or something else?

1 Like

Hey @khushiMishra, I’m also working on the eyedropper debugging tool, and I think displaying pixel coordinates (x, y) would be most useful, as it tells you exactly which pixel you’re inspecting. Would love to hear what others think.

1 Like

Great to see more people thinking about this! From debugging p5.strands shaders myself, I found that showing pixel coordinates alongside normalized 0-1 RGBA values works better, since that’s exactly the range shader code works in. The moment you can hover over a pixel, see e.g. R:0.48, and directly connect it to the value in your p5.strands code, the whole debugging experience changes.
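To illustrate, here is a rough sketch of a tooltip formatter combining all three readouts. It assumes the sampled pixel arrives as a 0-255 [r, g, b, a] array, as p5.js's get(x, y) returns it; the function name itself is made up.

```javascript
// Hypothetical eyedropper tooltip formatter. Input: pixel coordinates and
// an [r, g, b, a] array in 0-255, as p5.js's get(x, y) returns.
function formatEyedropper(x, y, rgba) {
  // Normalize to the 0-1 range shader code works in.
  const norm = rgba.map((v) => (v / 255).toFixed(2));
  // Hex equivalent of the RGB channels, for designers used to hex colors.
  const hex =
    "#" +
    rgba
      .slice(0, 3)
      .map((v) => v.toString(16).padStart(2, "0"))
      .join("");
  return `(${x}, ${y})  R:${norm[0]} G:${norm[1]} B:${norm[2]} A:${norm[3]}  ${hex}`;
}
```

For example, formatEyedropper(10, 20, [122, 0, 255, 255]) yields "(10, 20)  R:0.48 G:0.00 B:1.00 A:1.00  #7a00ff", so all three suggestions from this thread fit in a single hover line.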

1 Like

Hi everyone! I’m Ashu, a B.Tech student and Frontend Developer/UI Designer from India.

I’ve been exploring the p5.js Web Editor project ideas for GSoC 2026. After auditing the current interface, I felt inspired to create a high-fidelity Modern Dark Mode Concept in Figma to improve the visual hierarchy and accessibility of the editor.

My goal is to bring a professional, IDE-like experience (similar to VS Code) to the web editor while keeping the p5.js brand identity. I’d love to hear from the mentors—would a UI/UX overhaul and Component Library modernization be a priority for this year?

I’m currently diving into the React codebase and look forward to contributing!

1 Like

Thanks @Geeta112 and @aash.u7707 - this is really helpful! Combining pixel coordinates with normalized RGBA seems a good approach: coordinates for precise inspection, and normalized values to directly relate to shader logic.

2 Likes

Hello everyone, I’m writing a proposal for GSoC 2026 under the Processing Foundation and wanted to know if there are any proposal format docs that have been shared. I am currently following this: Processing-Foundation-GSoC/ProposalTemplate.md at main · processing/Processing-Foundation-GSoC · GitHub but if there’s a different one that needs to be followed, please let me know. Thanks :slight_smile:

1 Like

@kathrina you are following the correct template. It also includes examples that you can refer to when preparing your proposal.

2 Likes

Hi @kit thank you so much for the valuable feedback. I’ve made the changes accordingly and have once again submitted it via the feedback form. When should I expect a response: after the 25th of March, or earlier?

1 Like

Hi everyone :sunflower:

Everyone who submitted a feedback request on March 18th or earlier has now received feedback. @harshil I can see your second request; you will likely get a response earlier than March 25th. We can’t guarantee it, but reviews have been going pretty efficiently :)

@Samarth, welcome! That’s a great approach to that project; the mapping of matching vs diverging functions would be very valuable. I’ve forwarded your question to Kevin. We also have a short informal call tomorrow on discord.p5js.org with the current maintainer of p5.sound.js, Tommy Martinez, who will be doing a coding demo, so please feel free to join.

in the idea description’s note about being excited by approaches that go beyond literal code translation — is there a specific educator or artist workflow pain point behind that phrasing? I want to make sure the proposal addresses a real gap, not just what seems elegant technically.

This is a good question. The main educator/student challenge overall is shareability/portability. From a recent survey (the 2025 Processing Community Survey, which will be published in a few months and had ~500 responses):

  • About a third of all responses identified as educators. Of these, ~20% of teachers who use Processing (in general, not just sound) say that sharing project files is the most challenging thing in teaching. This is one reason that teachers use p5.js in some cases.
  • Across all responses (all self-identified labels), ~17% use Processing Sound. More complete analysis is not yet available.

If a contributor works on a Sound-related project this year, I’d be happy to help with outreach to artists and educators from this survey (those who said they want to be contacted for follow-up) to better understand their pain points and test the translation tool.

In general, all proposals should integrate community feedback; the above is an example of what that feedback can look like for this project.

@kathrina, welcome! Yes, as @NishthaJain said, that’s the correct template. Thanks for sharing the info.

@ashuverma25, welcome! For the web editor: keep in mind that there are also VS Code extensions for users who want a more professional development environment. The primary audience of the p5.js web editor is schools, and it is intentionally very minimal. While a proposal for redesigning the editor in this way is possible, it would require strong supporting research. I would also suggest not proposing anything that directly overlaps or conflicts with ongoing work, like CodeMirror 6 Feature Branch · Issue #3814 · processing/p5.js-web-editor · GitHub. I hope you join Kam’s Discord call tomorrow (check events in discord.p5js.org), as this may help with writing a p5.js editor related proposal.

Wonderful to see all the ideas about the eyedropper proposal as well:)

Best,
Kit

1 Like

Hi @kit! Thank you so much for the feedback and the warm welcome.

I completely understand the focus on the CodeMirror 6 (CM6) migration and the importance of keeping the editor minimal for educational use. I’ve spent some time today reviewing Issue #3814 and the ongoing sub-issues to make sure I’m aligned with the current technical roadmap.

My proposal would be specifically focused on the UI shell and accessibility improvements—ensuring that the new CM6 features have a clean, modern home without adding unnecessary complexity. I’m especially interested in how we can make the interface more intuitive for students while providing the ‘pro’ feel of a modern IDE through a refined dark mode and better component hierarchy.

I’ll definitely be joining the Discord call tomorrow to listen in and learn more about the current pain points. Looking forward to it!

1 Like

Hi @kit,

Thank you for the update!

I had submitted my feedback request on March 17 and just wanted to confirm whether it has been received. I checked my inbox but might have missed it.

Thanks again for all the effort you’re putting into reviewing proposals!

1 Like

Hi @kit. I’ve uploaded my draft proposal to the signup form. Could you please confirm whether it has been received?
Thank you.

1 Like

Hey everyone! Bit of an odd time to be introducing myself; I somehow missed this forum thread, so I’m a little late.

My name is Nitish and I am a design student from India looking to contribute to the L5 project. Over the last couple of days I’ve created a knowledge base of sorts for things that helped me understand the L5 codebase and start contributing to it. I have linked them below; do check them out and let me know if you found any others.

  • The lua-users wiki was a godsend, because the official documentation is literally a book.
  • The p5 and p5.sound references are a great place to learn the ‘Processing-style’ API which L5 plans to implement.
  • This great article that teaches you how p5.sound is consumed

I have one question for @lee: does the L5video library concern itself with playback only, or does it also expose a capture API?

2 Likes

Hi @Khushimishra Did you fill out the signup form? I don’t see your name there.

@reshma045 I see yours :check_box_with_check:

1 Like