NakedNous

✨ Welcome to my blog! ✨

I’m Jean Pierre Charalambos, researcher & educator passionate about creating didactic tools through creative coding and visual computing. I build open-source tools to make learning more playful 🖥️📚.

🚀 Active Projects

nub · p5.quadrille.js · p5.platonic · p5.tree

🧊 Inactive Projects

maku · shaderbase · proscene · p5.treegl legacy · p5.bitboard

🎓 Teaching Resources

Visual Computing Course · OOP Course

📚 Recent Publications

nub JORS Paper · p5.quadrille.js SoftwareX Paper

🧘 Yoga practitioner | 🏊 Open water swimmer | 📖 Science fiction | 🧠 Philosophy of mind

Tree

```js
'use strict'
let layer
let models
let focusVal = 0
let ui
let dofFilter, pixelatorFilter, noiseFilter
let font
// explicit scene camera (the one we animate) — MUST belong to the framebuffer renderer
let sceneCam
// toggles
let showAxes = true
let showGrid = true
// camera path UI state (we keep it ourselves; no _renderer access)
let pathLoop = true
let pathPlaying = false
let pathDuration = 45 // frames per segment
let pathRate = 1
let pathKeyframes = 0 // we update on add/reset; no introspection
// seek slider (DOM)
let sSeek
// post FX toggles + order
let fx
let fxOrder = 1 // 1..3 (preset orders)
let cNoise, cPixel, cBlur

function dofCallback () {
  const depthTex = uniformTexture(() => layer.depth)
  const focus = uniformFloat(() => focusVal)
  const dofIntensity = uniformFloat(() => ui.dofIntensity.value())
  const getBlurriness = (d) => abs(d - focus) * 40 * dofIntensity
  const maxBlurDistance = (b) => b * 0.01
  getColor((inputs, canvasContent) => {
    let colour = getTexture(canvasContent, inputs.texCoord)
    let samples = 1
    const centerDepth = getTexture(depthTex, inputs.texCoord).r
    const dofriness = getBlurriness(centerDepth)
    for (let i = 0; i < 20; i++) {
      const angle = float(i) * TWO_PI / 20
      const dofDistance = float(i) / 20 * maxBlurDistance(dofriness)
      const offset = [cos(angle), sin(angle)] * dofDistance
      const sampleDepth = getTexture(depthTex, inputs.texCoord + offset).r
      const sampleBlurDist = maxBlurDistance(getBlurriness(sampleDepth))
      if (sampleDepth >= centerDepth || sampleBlurDist >= dofDistance) {
        colour += getTexture(canvasContent, inputs.texCoord + offset)
        samples++
      }
    }
    colour /= float(samples)
    return [colour.rgb, 1]
  })
}

function pixelatorCallback () {
  const level = uniformFloat(() => ui.level.value())
  getColor((inputs, canvasContent) => {
    let stepCoord = inputs.texCoord * level
    stepCoord = floor(stepCoord)
    stepCoord = stepCoord / level
    const colour = getTexture(canvasContent, stepCoord)
    return [colour.rgb, 1]
  })
}

function noiseCallback () {
  const frequency = uniformFloat(() => ui.frequency.value())
  const amplitude = uniformFloat(() => ui.amplitude.value())
  const speed = uniformFloat(() => ui.speed.value())
  const hash = (p) => fract(sin(dot(p, [127.1, 311.7, 74.7])) * 43758.5453123)
  const fade = (t) => t * t * (3 - 2 * t)
  const valueNoise3 = (p) => {
    const i = floor(p)
    const f = fract(p)
    const u = fade(f)
    const n000 = hash(i + [0, 0, 0])
    const n100 = hash(i + [1, 0, 0])
    const n010 = hash(i + [0, 1, 0])
    const n110 = hash(i + [1, 1, 0])
    const n001 = hash(i + [0, 0, 1])
    const n101 = hash(i + [1, 0, 1])
    const n011 = hash(i + [0, 1, 1])
    const n111 = hash(i + [1, 1, 1])
    const nx00 = mix(n000, n100, u.x)
    const nx10 = mix(n010, n110, u.x)
    const nx01 = mix(n001, n101, u.x)
    const nx11 = mix(n011, n111, u.x)
    const nxy0 = mix(nx00, nx10, u.y)
    const nxy1 = mix(nx01, nx11, u.y)
    return (mix(nxy0, nxy1, u.z) * 2) - 1
  }
  getColor((inputs, canvasContent) => {
    const t = speed * (millis() / 1000)
    const s = frequency * inputs.texCoord.x
    const v = frequency * inputs.texCoord.y
    const n1 = valueNoise3([s, v, t])
    const n2 = valueNoise3([s + 17, v, t])
    const texCoords = inputs.texCoord + [amplitude * n1, amplitude * n2]
    const colour = getTexture(canvasContent, texCoords)
    return [colour.rgb, 1]
  })
}

function fxList () {
  const enabled = (name) => fx[name] && fx[name].enabled()
  const pick = (name) => (enabled(name) ? fx[name].shader : null)
  const presets = {
    1: ['noise', 'pixelator', 'dof'],
    2: ['pixelator', 'dof', 'noise'],
    3: ['dof', 'noise', 'pixelator']
  }
  const ord = presets[fxOrder] || presets[1]
  return ord.map(pick).filter(Boolean)
}

function fxOrderLabel () {
  if (fxOrder === 1) return '1: noise -> pixelator -> dof'
  if (fxOrder === 2) return '2: pixelator -> dof -> noise'
  if (fxOrder === 3) return '3: dof -> noise -> pixelator'
  return ''
}

function syncFxUI () {
  const noiseOn = cNoise.checked()
  const pixelOn = cPixel.checked()
  const dofOn = cBlur.checked()
  ui.frequency.visible = noiseOn
  ui.amplitude.visible = noiseOn
  ui.speed.visible = noiseOn
  ui.level.visible = pixelOn
  ui.dofIntensity.visible = dofOn
}

async function setup () {
  createCanvas(600, 420, WEBGL)
  font = await loadFont('noto_sans.ttf')
  textFont(font)
  layer = createFramebuffer()
  layer.begin()
  sceneCam = layer.createCamera()
  layer.end()
  ui = createUniformUI({
    frequency: { min: 0, max: 10, value: 3, step: 0.1, label: 'noise frequency' },
    amplitude: { min: 0, max: 1, value: 0.3, step: 0.01, label: 'noise amplitude' },
    speed: { min: 0, max: 1, value: 0.3, step: 0.01, label: 'noise speed' },
    level: { min: 2, max: 900, value: 300, step: 1, label: 'pixelator level' },
    dofIntensity: { min: 0, max: 4, value: 1.5, step: 0.1, label: 'dof intensity' }
  }, { x: 10, y: 10, width: 170, labels: true, title: 'Post FX', color: 'white' })
  noiseFilter = baseFilterShader().modify(noiseCallback)
  pixelatorFilter = baseFilterShader().modify(pixelatorCallback)
  dofFilter = baseFilterShader().modify(dofCallback)
  // FX toggles (checkboxes)
  cNoise = createCheckbox('noise', false)
  cPixel = createCheckbox('pixelator', false)
  cBlur = createCheckbox('dof', true)
  ;[cNoise, cPixel, cBlur].forEach((c, i) => {
    c.position(10, 10 + 260 + 12 + i * 20)
    c.style('color', 'white')
  })
  cNoise.changed(syncFxUI)
  cPixel.changed(syncFxUI)
  cBlur.changed(syncFxUI)
  fx = {
    noise: { shader: noiseFilter, enabled: () => cNoise.checked() },
    pixelator: { shader: pixelatorFilter, enabled: () => cPixel.checked() },
    dof: { shader: dofFilter, enabled: () => cBlur.checked() }
  }
  syncFxUI()
  pathPlaying = false
  pathKeyframes = 0
  sSeek = createSlider(0, 1, 0, 0.001)
  sSeek.input(() => {
    sceneCam.stopPath()
    pathPlaying = false
    sceneCam.seekPath(sSeek.value())
  })
  sSeek.position(width / 2 + 50, height - 50)
  sSeek.style('width', '220px')
  syncSeekUI()
  const trange = 200
  models = []
  for (let i = 0; i < 50; i++) {
    models.push({
      position: createVector(
        (random() * 2 - 1) * trange,
        (random() * 2 - 1) * trange,
        (random() * 2 - 1) * trange
      ),
      size: random() * 25 + 8,
      color: color(int(random(256)), int(random(256)), int(random(256))),
      type: i === 0 ? 'ball' : i < 25 ? 'torus' : 'box'
    })
  }
  console.log(p5.Tree.VERSION)
}

function draw () {
  background(10)
  if (pathKeyframes >= 2 && pathPlaying) {
    sSeek.value(sceneCam.pathTime())
  }
  layer.begin()
  setCamera(sceneCam)
  background(0)
  orbitControl()
  stroke(180, 90)
  showGrid && grid({ size: 500, subdivisions: 20 })
  showAxes && axes({ size: 220 })
  noStroke()
  ambientLight(100)
  const direction = mapDirection(p5.Tree._k, { from: p5.Tree.EYE, to: p5.Tree.WORLD })
  directionalLight(255, 255, 255, direction.x, direction.y, direction.z)
  specularMaterial(255)
  shininess(150)
  models.forEach(model => {
    push()
    fill(model.color)
    translate(model.position)
    model.type === 'box' ? box(model.size) : model.type === 'torus' ? torus(model.size) : sphere(model.size)
    pop()
  })
  focusVal = mapLocation(models[0].position, { from: p5.Tree.WORLD, to: p5.Tree.SCREEN }).z
  layer.end()
  pipe(layer, fxList())
  drawHud()
}

function drawHud () {
  const pad = 10
  const panelW = 240
  const x0 = width - panelW - pad
  const y0 = pad
  const lh = 16
  const lines = [
    'p5.tree: post FX + keyframes',
    '',
    'Post FX',
    ` [1/2/3] order: ${fxOrderLabel()}`,
    ` toggles: noise=${fx.noise.enabled() ? 'on' : 'off'} pixelator=${fx.pixelator.enabled() ? 'on' : 'off'} dof=${fx.dof.enabled() ? 'on' : 'off'}`,
    '',
    'Hints',
    ` [G] grid: ${showGrid ? 'on' : 'off'}`,
    ` [X] axes: ${showAxes ? 'on' : 'off'}`,
    '',
    'Keyframes / Path',
    ' [A] add keyframe (addPath snapshot)',
    ' [N] pathInfo()',
    ` [P] play/stop loop=${pathLoop ? 'on' : 'off'} rate=${pathRate}`,
    ' [R] resetPath()',
    ' [L] toggle loop',
    ' [<] reverse rate',
    ' [>] forward rate',
    ` duration: ${pathDuration} f/seg`,
    ` keyframes: ${pathKeyframes}`,
    ` state: ${pathPlaying ? 'playing' : pathKeyframes === 1 ? 'single keyframe' : 'stopped'}`
  ]
  beginHUD()
  push()
  noStroke()
  fill(0, 180)
  rect(x0, y0, panelW, pad + lines.length * lh + pad, 8)
  fill(255)
  textSize(12)
  textAlign(LEFT, TOP)
  let y = y0 + pad
  for (let i = 0; i < lines.length; i++) {
    text(lines[i], x0 + pad, y)
    y += lh
  }
  pop()
  endHUD()
}

function syncSeekUI () {
  if (pathKeyframes < 2) {
    sSeek && sSeek.hide()
    return
  }
  sSeek && sSeek.show()
  sSeek.value(constrain(sSeek.value(), 0, 1))
}

function onPathChanged (opt = {}) {
  const { keepPose = true } = opt
  sceneCam.stopPath()
  pathPlaying = false
  if (!keepPose) {
    sSeek && sSeek.value(0)
    pathKeyframes >= 1 && sceneCam.seekPath(0)
  }
  syncSeekUI()
}

function keyPressed () {
  if (key === 'g' || key === 'G') { showGrid = !showGrid; return true }
  if (key === 'x' || key === 'X') { showAxes = !showAxes; return true }
  if (key === '1' || key === '2' || key === '3') { fxOrder = int(key); return true }
  if (key === 'a' || key === 'A') {
    sceneCam.addPath()
    pathKeyframes++
    if (pathKeyframes === 2) sSeek && sSeek.value(1)
    onPathChanged({ keepPose: true })
    return true
  }
  if (key === 'n' || key === 'N') {
    sceneCam.pathInfo()
    return true
  }
  if (key === 'l' || key === 'L') {
    pathLoop = !pathLoop
    if (pathPlaying) {
      sceneCam.playPath({ duration: pathDuration, loop: pathLoop, rate: pathRate, onEnd: () => { pathPlaying = false; sSeek.value(sceneCam.pathTime()) } })
    }
    return true
  }
  if (key === '>') {
    pathRate = 1
    if (pathPlaying) {
      sceneCam.playPath({ duration: pathDuration, loop: pathLoop, rate: pathRate, onEnd: () => { pathPlaying = false; sSeek.value(sceneCam.pathTime()) } })
    }
    return true
  }
  if (key === '<') {
    pathRate = -1
    if (pathPlaying) {
      sceneCam.playPath({ duration: pathDuration, loop: pathLoop, rate: pathRate, onEnd: () => { pathPlaying = false; sSeek.value(sceneCam.pathTime()) } })
    }
    return true
  }
  if (key === 'p' || key === 'P') {
    if (pathKeyframes === 0) return true
    if (pathKeyframes === 1) {
      sceneCam.stopPath()
      pathPlaying = false
      sceneCam.playPath({ duration: pathDuration, loop: false, rate: 1 })
      syncSeekUI()
      return true
    }
    if (!pathPlaying) {
      sceneCam.playPath({ duration: pathDuration, loop: pathLoop, rate: pathRate, onEnd: () => { pathPlaying = false; sSeek.value(sceneCam.pathTime()) } })
      pathPlaying = true
    } else {
      sceneCam.stopPath()
      pathPlaying = false
    }
    return true
  }
  if (key === 'r' || key === 'R') {
    sceneCam.resetPath()
    pathKeyframes = 0
    sSeek && sSeek.value(0)
    onPathChanged({ keepPose: false })
    return true
  }
  return false
}

function mouseWheel () { return false }
```

February 17, 2026 · 8 min · Theme PaperMod

nub & p5.treegl: Advancing Visual Computing on the Web

Tree-like affine transformation hierarchies are at the core of many tasks in rendering, interaction, and computer vision—from view frustum & occlusion culling and collision detection to motion retargeting and post-WIMP interfaces. Our recent publication, nub: A Rendering and Interaction Library for Visual Computing in Processing, introduces a functional and declarative API, built around a dataflow-based architecture that integrates rendering and event-driven interaction through a simple yet powerful scene graph model. Built on top of Processing’s 2D/3D environment, nub offers a lightweight and expressive foundation for education, research, and experimentation in visual computing. It supports hierarchical rendering, multi-view scenes, view-based interaction, and extensible workflows for interactive content. This post presents an overview of its architecture and capabilities, and outlines future work extending it to the web through p5.treegl, with research directions focused on scene graphs, picking, gesture-based control, post-effects, and AI-assisted visual computing. ...
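Since the library centers on tree-like affine transformation hierarchies, a toy version of the idea may help. The sketch below is a hypothetical plain-JavaScript illustration (the `Node` and `mul` names are invented, not nub's API): each node stores a local transform, and its world transform is its parent's world transform composed with the local one.

```javascript
// Hypothetical sketch of an affine transformation hierarchy (not the nub API).
class Node {
  constructor(parent = null, tx = 0, ty = 0) {
    this.parent = parent
    this.local = [[1, 0, tx], [0, 1, ty], [0, 0, 1]] // 3x3 homogeneous 2D transform
  }
  world() { // compose ancestor transforms: world = parent.world * local
    return this.parent ? mul(this.parent.world(), this.local) : this.local
  }
}
function mul(a, b) { // plain 3x3 matrix product
  return a.map((row, i) => row.map((_, j) =>
    row.reduce((s, _, k) => s + a[i][k] * b[k][j], 0)))
}
// A child translated by (2, 0) under a parent translated by (1, 1):
const root = new Node(null, 1, 1)
const child = new Node(root, 2, 0)
console.log(child.world()[0][2], child.world()[1][2]) // world translation: 3 1
```

The same composition rule, generalized to 4x4 matrices with rotation and scale, is what drives hierarchical rendering and tasks like culling and picking.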

April 1, 2025 · 3 min

Brush Rosetta

This Rosetta demo is intended to give readers a starting point for porting the Brush VR app to three.js and WebGL2, highlighting some of the lower-level aspects of managing shaders.

```js
let color, depth, brush, escorzo = true, fallback = [], points = [], record
let basicShader

function preload() {
  loadJSON('/cloud_500.json', json =>
    fallback = json.map(entry => ({
      worldPosition: createVector(entry.x, entry.y, entry.z),
      color: entry.color
    }))
  )
}

function setup() {
  createCanvas(600, 400, WEBGL)
  colorMode(RGB, 1)
  document.oncontextmenu = () => false
  points = [...fallback]
  const o = parsePosition(Tree.ORIGIN, { from: Tree.WORLD, to: Tree.SCREEN })
  depth = createSlider(0, 1, o.z, 0.001)
  depth.position(10, 10)
  depth.style('width', '580px')
  color = createColorPicker('#C7C08D')
  color.position(width - 70, 40)
  brush = sphereBrush
  basicShader = parseShader(`#version 300 es
    precision highp float;
    uniform vec4 uMaterialColor;
    out vec4 fragColor;
    void main() {
      fragColor = uMaterialColor;
    }`, Tree.pMatrix | Tree.vMatrix | Tree.mMatrix)
}

function draw() {
  (mouseY >= 30) && orbitControl()
  shader(basicShader)
  record && update()
  background('#222226')
  axes({ size: 50, bits: Tree.X | Tree.Y | Tree.Z })
  for (const point of points) {
    push()
    translate(point.worldPosition)
    brush(point)
    pop()
  }
}

function keyPressed() {
  key === 'c' && (points = [])
  key === 'f' && focus()
  key === 'r' && (record = !record)
  key === 's' && saveCloud()
}

function update() {
  points.push({
    worldPosition: parsePosition([mouseX, mouseY, depth.value()], { from: Tree.SCREEN, to: Tree.WORLD }),
    color: color.color(),
  })
}

function focus() {
  const center = [0, 0, 0]
  const position = parsePosition()
  const up = parseDirection(Tree.j)
  camera(...position.array(), ...center, ...up.array())
  const o = parsePosition(Tree.ORIGIN, { from: Tree.WORLD, to: Tree.SCREEN })
  depth.value(o.z)
}

function sphereBrush(point) {
  push()
  noStroke()
  fill(point.color)
  sphere(1)
  pop()
}

function saveCloud() {
  const data = points.map(point => ({
    x: point.worldPosition.x,
    y: point.worldPosition.y,
    z: point.worldPosition.z,
    color: [red(point.color) / 255, green(point.color) / 255, blue(point.color) / 255, alpha(point.color) / 255]
  }))
  saveJSON(data, 'custom_cloud.json')
}
```

Shader Declaration

A key difference in this version is the use of parseShader, which defines a basic shader that handles color through the uMaterialColor uniform: ...

November 19, 2024 · 3 min

3D Brush Painting in VR

The following 3D brush painting algorithm integrates depth control for VR experiences by leveraging p5.treegl's parsePosition function. This allows users to paint dynamically in 3D space, using mouse input and a depth slider to place brush strokes with precision.

```js
'use strict'
let color, depth, brush, escorzo = true, fallback = [], points = [], record

function preload() {
  loadJSON('cloud_500.json', json =>
    fallback = json.map(entry => ({
      worldPosition: createVector(entry.x, entry.y, entry.z),
      color: entry.color
    }))
  )
}

function setup() {
  createCanvas(600, 400, WEBGL)
  colorMode(RGB, 1)
  document.oncontextmenu = () => false
  points = [...fallback]
  const o = parsePosition(Tree.ORIGIN, { from: Tree.WORLD, to: Tree.SCREEN })
  depth = createSlider(0, 1, o.z, 0.001)
  depth.position(10, 10)
  depth.style('width', '580px')
  color = createColorPicker('#C7C08D')
  color.position(width - 70, 40)
  brush = sphereBrush
}

function draw() {
  (mouseY >= 30) && orbitControl()
  record && update()
  background('#222226')
  axes({ size: 50, bits: Tree.X | Tree.Y | Tree.Z })
  for (const point of points) {
    push()
    translate(point.worldPosition)
    brush(point)
    pop()
  }
}

function keyPressed() {
  key === 'c' && (points = [])
  key === 'f' && focus()
  key === 'r' && (record = !record)
  key === 's' && saveCloud()
}

function sphereBrush(point) {
  push()
  noStroke()
  fill(point.color)
  sphere(1)
  pop()
}

function update() {
  points.push({
    worldPosition: parsePosition([mouseX, mouseY, depth.value()], { from: Tree.SCREEN, to: Tree.WORLD }),
    color: color.color(),
  })
}

function focus() {
  const center = [0, 0, 0]
  const position = parsePosition()
  const up = parseDirection(Tree.j)
  camera(...position.array(), ...center, ...up.array())
  const o = parsePosition(Tree.ORIGIN, { from: Tree.WORLD, to: Tree.SCREEN })
  depth.value(o.z)
}

function saveCloud() {
  const data = points.map(point => ({
    x: point.worldPosition.x,
    y: point.worldPosition.y,
    z: point.worldPosition.z,
    color: [red(point.color) / 255, green(point.color) / 255, blue(point.color) / 255, alpha(point.color) / 255]
  }))
  saveJSON(data, 'custom_cloud.json')
}
```

Key Elements of the 3D Brush Algorithm

The primary functionality revolves around transforming user input from screen space to world space, enabling dynamic point placement. Using the parsePosition function, we map 2D screen coordinates and depth values into the 3D world, allowing the brush to paint accurately in VR space. ...
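As a rough, hypothetical illustration of the screen-to-world idea (not p5.treegl's parsePosition, which handles the full perspective case), here is the analytically invertible orthographic variant: screen pixels map to normalized device coordinates, which map linearly back into the view volume.

```javascript
// Hedged sketch: screen (pixels + slider depth) -> NDC -> world, assuming an
// orthographic view volume. All names and parameters here are hypothetical.
function screenToWorld(sx, sy, depth, view) {
  const { w, h, left, right, bottom, top, near, far } = view
  // screen -> normalized device coordinates in [-1, 1]
  const nx = (sx / w) * 2 - 1
  const ny = 1 - (sy / h) * 2 // y axis is flipped between screen and NDC
  const nz = depth * 2 - 1    // slider depth in [0, 1] -> NDC z
  // NDC -> world: for an orthographic volume this is a plain linear remap
  return {
    x: left + (nx + 1) / 2 * (right - left),
    y: bottom + (ny + 1) / 2 * (top - bottom),
    z: -(near + (nz + 1) / 2 * (far - near)) // camera looks down -z
  }
}
const view = { w: 600, h: 400, left: -300, right: 300, bottom: -200, top: 200, near: 0, far: 1000 }
console.log(screenToWorld(300, 200, 0.5, view)) // canvas center, mid depth -> { x: 0, y: 0, z: -500 }
```

With a perspective camera the remap additionally involves the inverse projection matrix and a divide by w, which is exactly the bookkeeping parsePosition hides.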

October 4, 2024 · 3 min

Platonic Cells

This demo illustrates new capabilities of the WebGL mode in the upcoming major version of p5.quadrille.js, currently under development. It showcases how Platonic solids can be stored in quadrille cells and rendered in either immediate or retained mode with the p5.platonic library.

Platonic Cells

Platonic cells are cell functions (cellFn) that implement the filling of quadrille cells with Platonic solids.

Retained Mode (mouse click to clear/add Platonic solids, drag to navigate; press s (or c) to save) ...
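As a hypothetical sketch of the cell-function idea (names invented here, not the actual p5.quadrille.js API), a cellFn is just a function that maps a cell's value to a draw action:

```javascript
// Hedged sketch of a cellFn: a lookup from cell value to the solid it renders.
// In the real demo this would issue p5.platonic draw calls; here it reports them.
const solids = { 0: 'tetrahedron', 1: 'cube', 2: 'octahedron' }
function platonicCell({ value, size }) {
  return `draw ${solids[value]} at size ${size}`
}
console.log(platonicCell({ value: 1, size: 40 })) // → draw cube at size 40
```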

June 5, 2024 · 5 min

Platonic Solids

Platonic solids, named after the philosopher Plato, are highly symmetrical, three-dimensional shapes. Each face of a Platonic solid is the same regular polygon, and the same number of polygons meet at each vertex. This sketch demonstrates the rendering of Platonic solids using the p5.platonic library, showcasing both the immediate mode and retained mode rendering of the shapes. Check out the platonic cells example, which fills quadrille cells with the Platonic solids introduced here. ...
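As a quick aside, the five solids can be enumerated and sanity-checked against Euler's formula V - E + F = 2:

```javascript
// The five Platonic solids with their vertex, edge, and face counts,
// verified against Euler's polyhedron formula V - E + F = 2.
const solids = [
  { name: 'tetrahedron', V: 4, E: 6, F: 4 },
  { name: 'cube', V: 8, E: 12, F: 6 },
  { name: 'octahedron', V: 6, E: 12, F: 8 },
  { name: 'dodecahedron', V: 20, E: 30, F: 12 },
  { name: 'icosahedron', V: 12, E: 30, F: 20 }
]
solids.forEach(s => console.log(s.name, s.V - s.E + s.F === 2)) // all true
```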

May 17, 2024 · 4 min

Toon shading

Toon shading, or cel shading, is a non-photorealistic rendering technique that gives 3D graphics a cartoon-like appearance, commonly seen in various forms of visual media, such as video games and animations. The outlined toon shader achieves the effect's signature flat look by quantizing diffuse reflection into a finite number of discrete shades. The makeShader function parses the fragment shader source code to create a vertex shader and an interactive user interface, returning a toon p5.Shader that applyShader then uses for interactive real-time rendering of the scene. ...
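The quantization step itself is tiny. As a hedged plain-JavaScript sketch of the idea (the actual effect lives in the fragment shader):

```javascript
// Sketch of toon quantization: clamp the Lambertian diffuse term (N·L) to
// [0, 1], then snap it to one of `levels` discrete shades.
function toonShade(diffuse, levels = 4) {
  const d = Math.min(Math.max(diffuse, 0), 1)
  return Math.floor(d * levels) / levels
}
console.log(toonShade(0.0), toonShade(0.3), toonShade(0.6), toonShade(0.99))
// → 0 0.25 0.5 0.75
```

Because nearby surface angles collapse onto the same band, shading changes in visible steps rather than smooth gradients, which is what reads as "cartoon-like".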

March 29, 2024 · 6 min

Blur with focal target & first-person lighting

This demo delves into a WEBGL p5 sketch that builds upon the blur effect with added features. It introduces uniformsUI for interactively setting shader uniform variables (here just the blur intensity), incorporates a focal target defined by the scene's sole sphere (amidst random toruses and boxes) for enhanced visual depth, and employs a first-person directional light to improve immersion. It also showcases the applyShader function, demonstrating its role in applying and managing custom shader effects within the sketch. ...

March 23, 2024 · 5 min

Visualizing Perspective Transformation to NDC

Perspective projection is a fundamental concept in 3D graphics. This transformation, akin to morphing the view frustum into a cube known as Normalized Device Coordinates (NDC), generates a realistic foreshortening effect when applied to all scene vertices. The visualization below showcases this transformation on a set of boxes, rendered with a custom shader for shape morphing and viewed from a third-person perspective using a secondary camera, which enables the display of the main view frustum. ...
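A minimal sketch of the math, assuming a symmetric frustum (plain JavaScript with hypothetical helper names): build a projection matrix, multiply the eye-space point by it, then divide by w. Eye-space depths between the near and far planes land in the NDC range [-1, 1].

```javascript
// Symmetric-frustum perspective matrix (fov in radians, OpenGL-style).
function perspective(fov, aspect, near, far) {
  const f = 1 / Math.tan(fov / 2)
  return [
    [f / aspect, 0, 0, 0],
    [0, f, 0, 0],
    [0, 0, (far + near) / (near - far), (2 * far * near) / (near - far)],
    [0, 0, -1, 0]
  ]
}
function toNDC(p, m) { // p = [x, y, z] in eye space (camera looks down -z)
  const v = [p[0], p[1], p[2], 1]
  const clip = m.map(row => row.reduce((s, e, i) => s + e * v[i], 0))
  return clip.slice(0, 3).map(c => c / clip[3]) // the perspective divide
}
const m = perspective(Math.PI / 2, 1, 1, 100)
console.log(toNDC([0, 0, -1], m)[2], toNDC([0, 0, -100], m)[2]) // near -> -1, far -> 1
```

The divide by w is what bends the frustum into the NDC cube: points farther from the camera have larger w, so they shrink toward the center, producing foreshortening.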

March 1, 2024 · 4 min

Post-effects

Post-effects significantly enhance visual rendering, enabling the interactive blending of shader effects like bloom, motion blur, and ambient occlusion into rendered scenes. This demo showcases the blend of blur, noise and pixelate effects using frame buffer objects (FBOs) and WEBGL2 shaders, applied to a scene featuring randomly placed toruses and boxes, with a sphere acting as the dynamic focal point for the blur effect, thereby creating a visually engaging experience. By employing a user-space array, these effects are sequentially applied to a source FBO layer with the applyEffects(layer, effects, uniforms, flip) function. ...
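Conceptually, applying an ordered effect list is a fold: each pass consumes the previous pass's output. The sketch below is a hypothetical reduction with toy functions standing in for FBO passes, not the real applyEffects signature:

```javascript
// Hedged sketch: fold an ordered list of effects over a source layer.
// Real code would ping-pong between framebuffers; here each "effect" is
// just a pure function from one pixel array to another.
function applyEffects(layer, effects) {
  return effects.reduce((src, effect) => effect(src), layer)
}
const blur = px => px.map(v => v * 0.5)   // toy stand-in for a blur pass
const invert = px => px.map(v => 1 - v)   // toy stand-in for a second pass
console.log(applyEffects([1, 0], [blur, invert])) // blur then invert -> [0.5, 1]
```

Because the fold is ordered, swapping entries in the array changes the result, which is why the demos expose the effect order as an interactive choice.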

February 20, 2024 · 8 min