#webdev #javascript #typescript #productivity

I built a 3D weather visualizer in one night — because weather apps are boring

by Jacobo · March 2026

Let me set the scene: it's 5 AM. I'm staring at a weather app showing "14°C, rain." A number. An icon. That's it.

Weather is inherently cinematic. A thunderstorm over Miami is dramatic. Reykjavik fog is eerie. Tokyo in heavy rain has this whole Blade Runner vibe. Why does every weather app show me a sad little cloud emoji?

So I built City 3D Weather. Type any city, get a real-time Three.js cinematic scene of that city's actual conditions.

The problem with weather apps

Every weather app competes on data accuracy and UI polish. They're all trying to be the same thing: the fastest, cleanest way to see a number.

I wanted to go the other direction. What if the weather app was beautiful? What if you left it open because it looked good, not because you needed it?

The insight: weather data is rich. Open-Meteo gives you precipitation intensity, wind speed, cloud cover, visibility, lightning probability — more than enough to build distinct visual states. Most apps map all of that to one icon. I mapped it to a living 3D scene.

The technical core

The stack: Next.js 15 + Three.js + Open-Meteo API.

The core challenge is mapping weather codes to visual states. Open-Meteo uses WMO weather interpretation codes — 0 is clear sky, 95 is a thunderstorm, and 96 and 99 add hail. I built a classification layer:

type WeatherState = 'sunny' | 'cloudy' | 'rain' | 'snow' | 'storm' | 'fog';

function classifyWeather(weatherCode: number, windspeed: number): WeatherState {
  if (weatherCode === 0) return 'sunny';
  if (weatherCode <= 3) return windspeed > 30 ? 'storm' : 'cloudy'; // partly cloudy / overcast
  if (weatherCode >= 45 && weatherCode <= 48) return 'fog';
  if (weatherCode >= 51 && weatherCode <= 67) return 'rain';        // drizzle + rain
  if (weatherCode >= 71 && weatherCode <= 77) return 'snow';
  if (weatherCode >= 80 && weatherCode <= 82) return 'rain';        // rain showers
  if (weatherCode >= 85 && weatherCode <= 86) return 'snow';        // snow showers
  if (weatherCode >= 95) return 'storm';                            // thunderstorm
  return 'cloudy';
}

Then each state has its own Three.js scene: particle systems for rain/snow, directional light intensity for sun vs. overcast, fog density, and a lightning flash loop for storms.
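In practice that means each `WeatherState` drives a small bundle of scene parameters. Here's a rough sketch of what that table could look like — the field names and numbers are my own illustration, not the values I shipped:

```typescript
type WeatherState = 'sunny' | 'cloudy' | 'rain' | 'snow' | 'storm' | 'fog';

interface SceneParams {
  particleCount: number;  // rain/snow particles (0 = no particle system)
  sunIntensity: number;   // directional light intensity
  fogDensity: number;     // FogExp2 density
  lightningLoop: boolean; // run the flash loop for storms
}

const SCENE_PARAMS: Record<WeatherState, SceneParams> = {
  sunny:  { particleCount: 0,     sunIntensity: 1.2,  fogDensity: 0.0,   lightningLoop: false },
  cloudy: { particleCount: 0,     sunIntensity: 0.5,  fogDensity: 0.002, lightningLoop: false },
  rain:   { particleCount: 10000, sunIntensity: 0.3,  fogDensity: 0.004, lightningLoop: false },
  snow:   { particleCount: 6000,  sunIntensity: 0.4,  fogDensity: 0.006, lightningLoop: false },
  storm:  { particleCount: 12000, sunIntensity: 0.15, fogDensity: 0.005, lightningLoop: true  },
  fog:    { particleCount: 0,     sunIntensity: 0.25, fogDensity: 0.02,  lightningLoop: false },
};
```

One lookup, and the render loop just reads `SCENE_PARAMS[state]` instead of branching on weather codes everywhere.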

The city itself is procedurally generated — a grid of box geometries with randomized heights and slight color variation. Simple, but with the right lighting it reads as a real cityscape.
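The generator boils down to a nested loop. This is a sketch of the idea as described — the dimensions and the `Building` shape are my guesses, and each entry would become a `THREE.Mesh` with a `BoxGeometry`:

```typescript
interface Building { x: number; z: number; height: number; shade: number }

// Generate a centered gridSize x gridSize grid of buildings with
// randomized heights and a slight grayscale variation per building.
function generateCity(gridSize: number, spacing: number): Building[] {
  const buildings: Building[] = [];
  const half = (gridSize - 1) / 2;
  for (let i = 0; i < gridSize; i++) {
    for (let j = 0; j < gridSize; j++) {
      buildings.push({
        x: (i - half) * spacing,
        z: (j - half) * spacing,
        height: 5 + Math.random() * 40,   // randomized height
        shade: 0.4 + Math.random() * 0.2, // slight color variation
      });
    }
  }
  return buildings;
}
```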

What surprised me

Three.js particle systems are cheaper than I expected. I was nervous about performance — 10,000 rain particles sounds like a lot. In practice, with BufferGeometry and simple point materials, it runs at 60fps with no issues. The trick is reusing particle positions via a ring buffer instead of creating/destroying geometry every frame.
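The ring-buffer trick is simple once you see that a `BufferGeometry` position attribute is just a flat `Float32Array` of x/y/z triples. A minimal sketch of the per-frame update (parameter names are mine):

```typescript
// Move every particle down; recycle any that fall below the ground
// back to the top instead of allocating new geometry.
function updateRain(
  positions: Float32Array, // backing store of the position attribute
  fallSpeed: number,
  dt: number,
  ceiling: number,         // height at which recycled drops respawn
): void {
  for (let i = 1; i < positions.length; i += 3) { // y components only
    positions[i] -= fallSpeed * dt;
    if (positions[i] < 0) positions[i] += ceiling; // wrap, keep x/z untouched
  }
  // In Three.js you'd then set geometry.attributes.position.needsUpdate = true.
}
```

No allocation, no garbage collection pressure — just one pass over a typed array per frame.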

The hardest part was fog. Not technically — Three.js has built-in fog. The hard part was making it look like fog and not just "low visibility." I ended up with THREE.FogExp2, its density tuned carefully per weather state. It took longer than the whole rain particle system.
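For intuition on why density tuning matters: Three.js's exponential-squared fog attenuates visibility as exp(-(density · distance)²), so small density changes have an outsized effect at range. A tiny helper to see the curve (densities here are illustrative):

```typescript
// Fraction of a surface still visible through FogExp2 at a given distance.
// Matches the exp2 falloff Three.js uses: f = exp(-(density * distance)^2).
function fogVisibility(density: number, distance: number): number {
  return Math.exp(-Math.pow(density * distance, 2));
}
```

At density 0.02, a building 100 units away keeps only about 2% of its visibility — which is why the fog state needs its own value rather than reusing the rain one.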

Open-Meteo is genuinely great. Free, no API key, global coverage, updates every hour. I expected to spend a night fighting CORS and rate limits. Instead I had data in 10 minutes and spent the rest of the time on visuals.
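The whole data layer can be this small. A sketch of the request — the endpoint and `current_weather` parameter come from Open-Meteo's public docs, and no API key is needed:

```typescript
// Build the Open-Meteo forecast URL for a given coordinate.
function forecastUrl(latitude: number, longitude: number): string {
  const params = new URLSearchParams({
    latitude: latitude.toString(),
    longitude: longitude.toString(),
    current_weather: 'true',
  });
  return `https://api.open-meteo.com/v1/forecast?${params}`;
}

// Usage (Tokyo):
// const res = await fetch(forecastUrl(35.68, 139.69));
// const { current_weather } = await res.json(); // weathercode, windspeed, ...
```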

What I'd do next

The obvious next step is audio. Rain sounds, wind, distant thunder — the scene already looks cinematic, adding spatial audio would make it feel like a real environment. Web Audio API can do 3D-positioned audio, which would be wild with headphones.

I'd also add time of day. Right now the scene ignores whether it's 3 PM or 3 AM in Tokyo. The lighting should change — city lights at night, harsh sun at noon. Open-Meteo provides sunrise/sunset times, so this is a real option.
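If I build it, the dimmer could be a one-liner over those sunrise/sunset times — this is a hypothetical sketch, not shipped code, with times expressed as hours:

```typescript
// Sun intensity factor for the directional light: 0 at night,
// rising to 1 at solar noon via a sine arc across the day.
function sunFactor(now: number, sunrise: number, sunset: number): number {
  if (now <= sunrise || now >= sunset) return 0; // night: city lights only
  const t = (now - sunrise) / (sunset - sunrise); // 0..1 across the day
  return Math.sin(Math.PI * t);                   // peaks at solar noon
}
```

Multiply the state's base `sunIntensity` by this factor, and swap in emissive window materials when it hits zero.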

And honestly? I want a native app version. The browser tab is fine, but imagine this as a macOS screensaver.

Try it

It's live. Type your city. See what happens.

If you're in a city that's currently doing something interesting (that snowstorm in Oslo, I see you), I'd love to see a screenshot. Built in one night. 5 AM energy. Sometimes that's all it takes.