The fog estimator
The most-requested feature since v0.1 has been some kind of fog forecast. I held off because I didn't want to just wrap a weather API that already does this — I wanted to understand the model well enough to explain it.
What I landed on is a four-factor scoring function that runs on top of the Open-Meteo Marine API data:
- Dew-point spread — the difference between air temperature and dew point. When this drops below about 2°C, saturation is imminent.
- Marine boundary layer height — lower ceilings mean fog is more likely to persist rather than lift.
- Coastal upwelling index — cold upwelled water cools the surface air, which accelerates saturation near the coast.
- 10m wind speed and direction — onshore flow under 8 kn at night is a strong fog precursor on west-facing coasts.
Each factor contributes a weighted score from 0–10. The weights were calibrated against three months of ASOS surface observations at SFO, OAK, and HMB — which happen to be three of the foggiest airports in the continental US, so there was plenty of training data.
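To make the shape of the scoring concrete, here's a rough sketch in Python. The weights, cutoffs, and normalizations below are made up for illustration — the shipped values came out of the ASOS calibration and aren't reproduced here.

```python
def clamp(x, lo=0.0, hi=10.0):
    return max(lo, min(hi, x))

def fog_score(dewpoint_spread_c, mbl_height_m, upwelling_index,
              wind_speed_kn, onshore):
    # Map each factor onto a 0-10 subscore (cutoffs illustrative only).
    spread = clamp(10.0 * (1.0 - dewpoint_spread_c / 5.0))  # small spread -> high
    ceiling = clamp(10.0 * (1.0 - mbl_height_m / 1000.0))   # low ceiling -> high
    upwell = clamp(upwelling_index)                          # assume index is ~0-10
    wind = clamp(10.0 - wind_speed_kn) if onshore else 0.0   # light onshore -> high

    # Hypothetical weights; the real ones were fit against surface obs.
    return 0.4 * spread + 0.25 * ceiling + 0.2 * upwell + 0.15 * wind

# 1.5 degC spread, 250 m boundary layer, strong upwelling, 5 kn onshore:
score = fog_score(1.5, 250.0, 6.0, 5.0, onshore=True)
```

The output is a single 0–10 number, which is all the CLI needs to turn into a "bother driving out there or not" verdict.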
It's not going to out-predict a numerical weather model. But for a quick "should I bother driving out there this morning" judgment, it's been accurate in my informal testing — I've been running the pre-release build since February and I'd say it gets the burn-off time right to within 45 minutes most days.
Stable --json schema
I've added --json to all three main subcommands (tides, fog, surf), and I'm committing to schema stability for the rest of the 0.x lifecycle. This means:
- Existing fields won't be removed or renamed in patch and minor releases.
- New fields may be added (they'll be in the changelog).
- The schema version is included in the output as `_schema`, so you can gate on it.
The tides JSON output looks like this:

```json
{
  "_schema": "zeemist.tides.v1",
  "station_id": "9414290",
  "station_name": "San Francisco, CA",
  "generated_at": "2026-04-28T07:12:03Z",
  "unit": "ft",
  "predictions": [
    { "t": "2026-04-28T05:14:00Z", "type": "H", "v": 5.8 },
    { "t": "2026-04-28T11:42:00Z", "type": "L", "v": 0.3 },
    ...
  ]
}
```
If you're piping to Waybar or a tmux status bar, this is the interface to use. I'll write a separate post on integration examples once I have a few more.
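If you're consuming the JSON from a script rather than a status bar, it's worth actually gating on `_schema` before touching other fields. A minimal sketch in Python (the sample payload is abbreviated; the parse function and its name are mine, not part of Zeemist):

```python
import json

EXPECTED_SCHEMA = "zeemist.tides.v1"

def parse_tides(raw: str) -> list[dict]:
    """Parse `zeemist tides --json` output, refusing unknown schemas."""
    payload = json.loads(raw)
    schema = payload.get("_schema")
    if schema != EXPECTED_SCHEMA:
        raise ValueError(f"unexpected schema {schema!r}, wanted {EXPECTED_SCHEMA}")
    return payload["predictions"]

# Abbreviated sample of the documented output shape:
sample = '''{
  "_schema": "zeemist.tides.v1",
  "station_id": "9414290",
  "unit": "ft",
  "predictions": [{"t": "2026-04-28T05:14:00Z", "type": "H", "v": 5.8}]
}'''
preds = parse_tides(sample)
```

Because new fields may be added in minor releases, match on the fields you need and ignore the rest; only a schema string you don't recognize should be treated as an error.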
Cache warm path speedup
On a warm cache (i.e., data already fetched within the TTL), Zeemist was doing unnecessary work: it was deserializing the full response from disk just to check whether it was stale. That's now fixed with a lightweight metadata sidecar file that stores only the fetch timestamp and a content hash.
Result: startup with a warm cache dropped from ~22ms to ~9ms on my M2 MacBook Air. Not life-changing, but it was embarrassing overhead for a tool that's supposed to feel instant.
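The sidecar idea, roughly: next to each cached response, write a tiny metadata file with just the fetch timestamp and content hash, and read only that file on the hot path. A sketch in Python — the file naming, TTL, and function names here are illustrative, not Zeemist's actual layout:

```python
import json
import time
from pathlib import Path

TTL_SECONDS = 6 * 60 * 60  # hypothetical TTL

def write_sidecar(cache_path: Path, content_hash: str) -> None:
    """Record fetch time and hash next to the cached response."""
    meta = {"fetched_at": time.time(), "content_hash": content_hash}
    cache_path.with_suffix(".meta.json").write_text(json.dumps(meta))

def is_fresh(cache_path: Path) -> bool:
    """Staleness check that reads only the tiny sidecar -- the full
    cached response is never deserialized just to check the TTL."""
    meta_path = cache_path.with_suffix(".meta.json")
    try:
        meta = json.loads(meta_path.read_text())
    except (FileNotFoundError, json.JSONDecodeError):
        return False
    return (time.time() - meta["fetched_at"]) < TTL_SECONDS
```

The win comes entirely from how little `is_fresh` has to read: a few dozen bytes of metadata instead of the whole serialized API response.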
The async-trait refactor
This is the boring part that took the most time. The original fetch layer used the async-trait crate to define a DataSource trait. Rust 1.75 stabilized async functions in traits natively, which made async-trait unnecessary — but migrating off it exposed a subtle lifetime issue in the NOAA CO-OPS client that I hadn't caught before because async-trait was boxing futures and hiding the problem.
I ended up with a deadlock under load when two fetch requests for the same station arrived simultaneously: both would check the cache (miss), both would start a fetch, and the second one would block on a mutex that the first one held while awaiting an HTTP response. The fix was a per-station request coalescing map — if a fetch is already in flight for a given station ID, subsequent requests wait for it to complete rather than starting their own.
This is only observable under load (e.g., if you're calling Zeemist from a shell script that fires multiple parallel requests), but it was a real bug and I'm glad the refactor forced me to find it.
Upgrading
```sh
# If you installed via the shell script
curl -sSL https://zeemist.cfd/install.sh | sh

# If you installed via Cargo
cargo install zeemist --force
```
SHA-256 checksums for the v0.4.2 binaries are on the changelog page.
Thanks to everyone who filed issues and sent patches. The fog estimator idea came from a conversation with F. Meriwether, who got annoyed that his tide-watching trips to Rodeo Beach kept getting cancelled by fog he hadn't seen in the forecast. Same, honestly.