Featured

Selected Work & Writing

Interactive systems, tools, and experiments.

📌 Pinned

I'm Jorge Arévalo. I’ve worked in electronics engineering, software development, illustration… now I’m exploring physical and industrial design. Titles aside, I like building things that move, respond, and tell a story.

I design and build interactive experiences that blend software, electronics, and physical objects. My work lives at the intersection of mechanics, visuals, and storytelling – from kinetic toys and AR-driven devices to immersive installations.

My background spans electronics engineering, CAD modeling, software development, illustration, and system design. I integrate these disciplines to turn concepts into tangible, interactive, and visually engaging objects.

I’m particularly drawn to projects that challenge the line between physical and digital: objects that move, respond, and tell a story. Currently exploring AR, interactive installations, and modular devices that blend human experience with electromechanical systems.

Portfolio & Work:

Illustration Portfolio

GitHub – Projects and Prototypes

LinkedIn – Professional Background

Collaborate / Contact:


Or leave a message:


About this website

I’ll be writing a series of articles about it here.

Resume

My resume now lives on a dedicated page with the latest PDF, JSON data, and form links.


I keep circling the idea of building physical stuff. Robots, furniture, interactive pieces. Things that live in the real world. For a while, the gap between a design in my head and a physical object felt huge. So I started stitching together a workflow, a set of tools that would let me move from a precise, parametric design to a rendered, interactive showcase.

The Workflow: A Look at Each Step

FreeCAD: For Parametric Design

It starts with FreeCAD. I needed something that understood that dimensions change. That a hole might need to be 2mm wider, or a support beam 10mm taller. Parametric design is the key. It lets me build models that are adaptable, that can be tweaked and adjusted without starting from scratch. It’s the foundation for anything I want to eventually build.
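The core idea can be sketched outside CAD too. Here is a minimal TypeScript illustration of what "parametric" means in practice: derived dimensions are computed from base parameters, so changing one value updates everything that depends on it. The bracket, its parameter names, and the numbers are invented for illustration, not taken from an actual FreeCAD model.

```typescript
// Hypothetical sketch of the parametric idea: derived dimensions
// recompute automatically when a base parameter changes.
interface BracketParams {
  holeDiameter: number // mm
  wallThickness: number // mm
  beamHeight: number // mm
}

// Derived dimensions are functions of the parameters, never hand-edited.
function deriveBracket(p: BracketParams) {
  return {
    holeDiameter: p.holeDiameter,
    // Clearance scales with the hole, so widening the hole 2mm
    // adjusts the surrounding material without redrawing anything.
    footprintWidth: p.holeDiameter + 2 * p.wallThickness,
    totalHeight: p.beamHeight + p.wallThickness,
  }
}

const v1 = deriveBracket({ holeDiameter: 6, wallThickness: 3, beamHeight: 40 })
const v2 = deriveBracket({ holeDiameter: 8, wallThickness: 3, beamHeight: 50 })
```

Widening the hole from 6 to 8 mm regenerates the footprint; that is the "tweak without starting from scratch" property, reduced to a pure function.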

Blender: For Rendering and Logic

Next, I bring the parts into Blender. It’s not just for making things look good, though it does that well. Blender is where I prepare the scene for the web. More importantly, it’s where I can attach logic to the models. Using custom properties, I can tag an object to define its physical behavior in the game engine. A platform can be marked as 'world' geometry, a trigger volume as a 'zone'. This is where a static model starts to become an interactive object.
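On the web side, those tags have to be interpreted. When Blender exports to glTF, custom properties typically land in each node's `extras`, which loaders surface as per-object `userData`. The mapping below is a hedged sketch of that interpretation step, reusing the 'world' and 'zone' tags from above; the `Behavior` shape and `behaviorFor` function are hypothetical names, not part of any engine's API.

```typescript
// Custom properties exported from Blender end up in glTF "extras",
// which loaders typically surface as per-object userData.
// The tag names ("world", "zone") mirror the ones described above;
// the behavior mapping itself is an illustrative sketch.
type PhysicsTag = "world" | "zone"

interface SceneObject {
  name: string
  userData: { physics?: string }
}

interface Behavior {
  collides: boolean
  isTrigger: boolean
}

function behaviorFor(obj: SceneObject): Behavior {
  switch (obj.userData.physics as PhysicsTag | undefined) {
    case "world":
      return { collides: true, isTrigger: false } // static level geometry
    case "zone":
      return { collides: false, isTrigger: true } // fires an event on overlap
    default:
      return { collides: false, isTrigger: false } // decorative mesh
  }
}
```

The point is that the model carries its own gameplay metadata: the web app never hardcodes which mesh is a floor and which is a trigger.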

This App: The Interactive Showcase

Finally, everything lands here, in this web app. It’s more than just a gallery. It’s a place where I can run the 3D scenes, where the physics I defined in Blender come to life. It's the portfolio I can show to clients, a living demonstration of what I can do, from design to code.

In Practice: The Marble Drop Demo

This little demo is a concrete example of the workflow. A ball appears at a spawn point, then gravity takes over. It tumbles down a series of platforms. Each platform's behavior is controlled by those custom properties I set in Blender. One might be a simple static platform, another might be a trigger that ends the game. It’s a simple concept, but it proves out the entire pipeline.
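Stripped of rendering, the demo's logic is small. Here is a toy, one-dimensional version of that loop, assuming nothing about the real engine: the ball falls to the first platform below it, and the platform's kind (mirroring the Blender tags) decides the outcome. All names and numbers are invented for illustration.

```typescript
// A toy, 1D version of the marble-drop logic: the ball falls to the
// first platform below its spawn height, and that platform's kind
// decides what happens. Platform kinds mirror the Blender tags.
interface Platform {
  y: number
  kind: "static" | "end"
}

function drop(spawnY: number, platforms: Platform[]): "landed" | "game-over" | "falling" {
  // The ball hits the highest platform that sits below the spawn point.
  const hit = platforms
    .filter((p) => p.y < spawnY)
    .sort((a, b) => b.y - a.y)[0]
  if (!hit) return "falling" // fell past everything
  return hit.kind === "end" ? "game-over" : "landed"
}

// Example level: one static platform, one game-ending trigger below it.
const platforms: Platform[] = [
  { y: 10, kind: "static" },
  { y: 0, kind: "end" },
]
```

Simple on purpose: the interesting part is that the `kind` values arrive from Blender metadata rather than being written into the app.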


Next Steps

This is the start. With this workflow, I can quickly prototype and showcase ideas for interactive objects and installations. The next step is to use this for actual client work, to build things that people can touch and interact with, not just on a screen.

📌 Pinned

An interactive tool that evaluates stroke consistency, pressure, and tilt in real time. Designed to make repetitive motor practice measurable and slightly less boring.


This project explores whether line quality can be trained deliberately rather than intuitively. The app analyzes stroke smoothness, direction stability, and pressure variation, then scores performance based on consistency. It turns a traditionally tedious warm-up exercise into a measurable feedback loop.

The goal isn’t to teach anatomy or composition. It’s to isolate motor control and make improvement observable.
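To make "measurable" concrete, here is one plausible metric of the kind described above: direction stability, scored from how much successive segment angles deviate from the stroke's mean direction. The scoring scale and function name are invented for this sketch; they are not the app's actual formula.

```typescript
// One plausible metric from the description above: direction stability,
// measured as the variance of successive segment angles around the
// stroke's mean direction. Lower spread = steadier line.
// The 0..1 scoring scale is an invented example.
interface Sample {
  x: number
  y: number
}

function directionStability(samples: Sample[]): number {
  if (samples.length < 3) return 1 // too short to wobble
  const angles: number[] = []
  for (let i = 1; i < samples.length; i++) {
    const dx = samples[i].x - samples[i - 1].x
    const dy = samples[i].y - samples[i - 1].y
    angles.push(Math.atan2(dy, dx))
  }
  const mean = angles.reduce((a, b) => a + b, 0) / angles.length
  const variance =
    angles.reduce((a, b) => a + (b - mean) ** 2, 0) / angles.length
  // Map variance to a score: 1 = perfectly straight, falling toward 0
  // as the line wobbles.
  return 1 / (1 + variance)
}
```

Pressure variation and smoothness can be scored the same way: reduce the raw pointer samples to one number, then track it across repetitions.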

A tiny experiment: model a simple decision flow as a diagram, then encode it directly in code. Nothing ambitious. Just clarity.

I’ve been thinking about how often I build systems that are more complicated than they need to be.

Sometimes the cleanest thing is just a state machine. A few states. A few transitions. No abstractions pretending to be frameworks.

Here’s a small one. Nothing dramatic. Just a post that can be in draft, published, or archived.

```mermaid
stateDiagram-v2
    draft --> published: publish
    published --> archived: archive
    archived --> draft: restore
```

That’s it. Three states. Three transitions. No hidden logic.

Now the code version:

```typescript
type PostState = "draft" | "published" | "archived"

function transition(current: PostState, action: string): PostState {
  switch (current) {
    case "draft":
      if (action === "publish") return "published"
      break
    case "published":
      if (action === "archive") return "archived"
      break
    case "archived":
      if (action === "restore") return "draft"
      break
  }
  return current
}
```

There’s something calming about this.

The diagram shows shape. The code shows constraint.

They’re the same idea expressed differently. One is spatial, the other procedural.

Most of the time when something feels messy in a system, it’s because the states aren’t clear.

Lightweight vector animations designed for fast, interactive web experiences without heavy 3D engines.

These experiments explore how far SVG can go before you need a canvas or WebGL context. The focus is on performance, clarity, and expressive motion with minimal runtime overhead.
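The basic pattern is small: compute a transform string per frame and set it as an attribute, with easing done in plain math rather than a library. The sketch below assumes that shape; `easeInOut` and `orbitTransform` are illustrative helpers, not an existing API.

```typescript
// A minimal pattern for SVG motion without a 3D engine: compute a
// transform string per frame and assign it to an element's attribute.
// easeInOut and orbitTransform are illustrative helpers, not a library API.
function easeInOut(t: number): number {
  // Smoothstep: gentle start and stop, cheap to evaluate every frame.
  return t * t * (3 - 2 * t)
}

function orbitTransform(t: number, radius: number): string {
  const angle = easeInOut(t) * 2 * Math.PI
  const x = Math.cos(angle) * radius
  const y = Math.sin(angle) * radius
  return `translate(${x.toFixed(2)} ${y.toFixed(2)})`
}

// In the browser, the driving loop would look like:
// const el = document.querySelector("#dot")
// const start = performance.now()
// function frame(now: number) {
//   const t = Math.min((now - start) / 2000, 1)
//   el?.setAttribute("transform", orbitTransform(t, 40))
//   if (t < 1) requestAnimationFrame(frame)
// }
// requestAnimationFrame(frame)
```

Keeping the math in pure functions also makes the motion testable, which a canvas or WebGL pipeline rarely is.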
