Offline Navigation for PWA Micro Apps: Caching Map Data for Non-Developers

cached
2026-01-31
12 min read

Step-by-step PWA recipe to cache tiles, routes, and search results offline using service workers and IndexedDB — no heavy dev work.

Ship reliable offline navigation for micro apps — without being a backend engineer

If you manage micro apps, run internal nav tools, or empower non-developer contributors, you know the pain: maps that fail when connectivity drops, unpredictable cache freshness, and heavy CDN bills when people pre-download tiles. This recipe shows a practical, no-code-friendly path to a tiny PWA navigation micro app that caches map tiles, routes, and search results for offline use with service workers and IndexedDB, plus notes on headers, Redis, and Varnish for predictable freshness and low cost.

Why this matters in 2026

As of late 2025 and into 2026, two trends make offline-first micro apps critical: widespread adoption of edge-first architectures and mainstreaming of low-code/vibe-coding tools that let non-developers build personal or team apps quickly. At the same time, vector tiles and client-side renderers (MapLibre, smaller Mapbox alternatives) have made on-device rendering cheaper. The result: tiny navigation micro apps can offer near-native offline experience with small, predictable infrastructure costs — if you design caching correctly.

What you'll build (overview)

  • A PWA micro app shell (manifest + service worker) that installs on a device.
  • A service worker strategy: Cache Storage for tiles & assets, and IndexedDB for routes/search results and metadata.
  • “Download area” UI that prefetches tiles in a bounding box and stores them efficiently.
  • Server-side guidance: headers, Redis for origin caching, and Varnish/CDN purge strategy for cache correctness.

Design principles

  • Cache what you need: tiles and last-used routes; avoid attempting full offline routing unless you control the device footprint.
  • Make freshness predictable: use versioned caches and server Cache-Control/ETag headers.
  • Limit downloads: cap zoom levels and tile counts during prefetch.
  • Fail gracefully: provide fallbacks and informative UI when resources are missing.

Step 0 — Prerequisites (no-code friendly)

  • A PWA template (PWA Builder, Glitch, or StackBlitz). These let non-devs start a PWA and paste a small service worker.
  • Choose a tiles source: prefer MapTiler, Mapbox (observe quotas), or a self-hosted tileserver. Important: read the provider's terms; many prohibit bulk offline caching without a license.
  • For routing/search: either use a hosted API (with rate-limits) or your organization's small origin (Node/Flask) backed by Redis for caching responses.

Step 1 — Basic PWA shell and service worker registration

Add a minimal manifest.json and register a service worker. Non-developers can paste these into a PWA template editor.

// register-sw.js
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
    .then(() => console.log('SW registered'))
    .catch(err => console.error('SW registration failed', err));
}

Manifest (simple): name, icons, start_url, display: standalone. This enables installation and offline behavior in mobile browsers.
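
A minimal manifest.json sketch (the name, colors, and icon paths are placeholders; match them to your template's assets):

{
  "name": "Offline Nav",
  "short_name": "Nav",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#1a73e8",
  "icons": [
    {"src": "/icons/icon-192.png", "sizes": "192x192", "type": "image/png"},
    {"src": "/icons/icon-512.png", "sizes": "512x512", "type": "image/png"}
  ]
}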

Step 2 — Service worker: caching policy and fetch handler

We'll use two caches: one for the static app shell and one for map tiles. Routes and search responses are stored in IndexedDB (structured, queryable). The service worker implements:

  • Stale-while-revalidate for tiles: fast load, with a background refresh.
  • Network-first with fallback for API routes: try the network when online; fall back to the cached route in IndexedDB when offline.

// sw.js (simplified)
const APP_CACHE = 'app-shell-v1';
const TILE_CACHE = 'tiles-v1';

self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(APP_CACHE).then(cache => cache.addAll(['/','/index.html','/styles.css','/app.js']))
  );
  self.skipWaiting();
});

self.addEventListener('activate', event => {
  event.waitUntil(self.clients.claim());
});

self.addEventListener('fetch', event => {
  const url = new URL(event.request.url);

  // Map tiles (raster or vector) pattern e.g. /tiles/{z}/{x}/{y}.png
  if (url.pathname.startsWith('/tiles/')) {
    event.respondWith(tileHandler(event.request));
    return;
  }

  // API requests (routing/search)
  if (url.pathname.startsWith('/api/')) {
    event.respondWith(apiHandler(event.request));
    return;
  }

  // Default: app shell cache first
  event.respondWith(
    caches.match(event.request).then(resp => resp || fetch(event.request))
  );
});

async function tileHandler(req) {
  const cache = await caches.open(TILE_CACHE);
  const cached = await cache.match(req);
  // Serve the cached tile immediately; meanwhile refresh in the background
  const fetchAndCache = fetch(req).then(res => {
    if (res.ok) cache.put(req, res.clone());
    return res;
  }).catch(() => null);
  if (cached) return cached;
  // No cached copy: wait for the network, and return 504 only if that fails too
  const fresh = await fetchAndCache;
  return fresh || new Response(null, {status: 504});
}

async function apiHandler(req) {
  try {
    const networkResp = await fetch(req);
    // Optionally clone the response and forward it to pages (postMessage)
    // so they can persist it into IndexedDB (see Step 3)
    return networkResp;
  } catch (err) {
    // Offline: the page falls back to IndexedDB (getRoute); the SW just
    // signals the condition with a structured error
    return new Response(JSON.stringify({error:'offline'}), {status: 503, headers:{'Content-Type':'application/json'}});
  }
}

This is intentionally compact. In production, add proper error handling and content negotiation for vector vs raster tiles.
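
One hedged way to wire up that IndexedDB handoff: have the service worker broadcast successful route responses to open pages, which then persist them with the Step 3 helpers. The 'route-response' message shape below is an assumption for this recipe, not a standard:

// sw.js addition: broadcast API responses to pages (sketch)
// Call this from apiHandler after a successful fetch
async function notifyClientsOfRoute(req, res) {
  const body = await res.clone().json(); // clone so the original body stays usable
  const pages = await self.clients.matchAll({type: 'window'});
  pages.forEach(p => p.postMessage({type: 'route-response', url: req.url, body}));
}

// app.js: page side, persist forwarded routes into IndexedDB
navigator.serviceWorker.addEventListener('message', e => {
  if (e.data && e.data.type === 'route-response') {
    saveRoute(e.data.url, e.data.body); // helper from Step 3
  }
});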

Step 3 — IndexedDB for routes and search results

Use a tiny helper like idb (simple one-file library) to avoid complex boilerplate. IndexedDB stores JSON route objects and search results with timestamps and TTL.

// idb-helpers.js (concept)
import {openDB} from 'idb';
const dbPromise = openDB('nav-db', 1, {upgrade(db){
  db.createObjectStore('routes', {keyPath:'id'});
  db.createObjectStore('search', {keyPath:'q'});
}});

export async function saveRoute(id, routeObj) {
  const db = await dbPromise;
  await db.put('routes', {...routeObj, id, ts: Date.now()});
}

export async function getRoute(id) {
  const db = await dbPromise;
  const r = await db.get('routes', id);
  if (!r) return null;
  // TTL 24h by default
  if (Date.now() - r.ts > 24*3600*1000) return null;
  return r;
}

For non-developers using editors, paste these helpers into a single file and import them into your app logic. The pattern is straightforward: when a route API returns, call saveRoute(id, response). When offline, call getRoute(id) to show the last-known route.
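
As a concrete sketch of that pattern (the /api/route endpoint and the key scheme are assumptions; adapt them to your routing API):

// app.js: network-first route fetch with IndexedDB fallback
// from/to are [lat, lng] arrays (illustrative)
async function fetchRoute(from, to) {
  const id = `${from.join(',')}|${to.join(',')}`; // hypothetical cache key
  try {
    const res = await fetch(`/api/route?from=${from.join(',')}&to=${to.join(',')}`);
    const route = await res.json();
    await saveRoute(id, route); // persist for offline reuse
    return route;
  } catch (err) {
    const cached = await getRoute(id); // last-known route, if any
    if (cached) return cached;
    throw err; // truly offline with nothing cached
  }
}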

Step 4 — Download area: compute tiles and prefetch

The UX: user taps “Download area” and specifies a bounding box and zoom levels. The app computes XYZ tiles to fetch and enqueues them to the service worker, which stores them in the tile cache.

Tile math (small helper)

function lng2tile(lng, zoom) { return Math.floor((lng + 180) / 360 * Math.pow(2, zoom)); }
function lat2tile(lat, zoom) {
  const rad = lat * Math.PI / 180;
  return Math.floor((1 - Math.log(Math.tan(rad) + 1/Math.cos(rad)) / Math.PI) / 2 * Math.pow(2, zoom));
}

function tilesForBbox(minLat, minLng, maxLat, maxLng, zoom) {
  const xMin = lng2tile(minLng, zoom);
  const xMax = lng2tile(maxLng, zoom);
  const yMin = lat2tile(maxLat, zoom); // note lat order
  const yMax = lat2tile(minLat, zoom);
  const tiles = [];
  for (let x=xMin; x<=xMax; x++) for (let y=yMin; y<=yMax; y++) tiles.push({z:zoom,x,y});
  return tiles;
}

Important: limit the zoom range and total tile count. Example: a bounding box for a small town at zooms 12–15 often yields a few hundred tiles; anything over ~2,000 tiles is a heavy download on mobile. A simple guard, sketched below, keeps users inside that budget.
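
A small guard sketch that reuses tilesForBbox to estimate the download before fetching (the 2,000-tile limit is this article's rule of thumb, not a platform constraint):

// prefetch-guard.js: refuse oversized downloads up front
const MAX_TILES = 2000;

function planDownload(minLat, minLng, maxLat, maxLng, zooms) {
  const tiles = zooms.flatMap(z => tilesForBbox(minLat, minLng, maxLat, maxLng, z));
  if (tiles.length > MAX_TILES) {
    throw new Error(`Area too large: ${tiles.length} tiles (cap ${MAX_TILES}); shrink the bbox or zoom range`);
  }
  return tiles;
}

// Example: a small-town bbox at zooms 12-15 (coordinates are illustrative)
// const tiles = planDownload(47.36, 8.51, 47.41, 8.58, [12, 13, 14, 15]);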

Prefetch algorithm (client-side)

  1. Compute tile list for chosen zooms.
  2. Chunk into batches (e.g., 10 concurrent fetches) to avoid saturating network.
  3. Fetch tile URLs via fetch() and rely on the service worker to cache them (or manually put into Cache API).
  4. Update progress and allow cancel/resume.

// prefetch.js
async function prefetchTiles(tiles, concurrency=10) {
  let i=0;
  async function worker() {
    while (i < tiles.length) {
      const idx = i++;
      const t = tiles[idx];
      const url = `/tiles/${t.z}/${t.x}/${t.y}.png`;
      try { await fetch(url); } catch(e){ console.warn('tile failed', url); }
    }
  }
  await Promise.all(new Array(concurrency).fill(0).map(worker));
}

Because the service worker's fetch handler intercepts these tile requests, the prefetch loop needs no explicit Cache API calls; the caching logic stays centralized in sw.js.

Step 5 — Offline routing strategy (practical options)

Full client-side routing (download entire routable graph) is heavy and rarely suitable for micro apps. Use pragmatic alternatives:

  • Cache-last-route: when the user requests a route, store the route geometry & instructions in IndexedDB. If offline later, serve that route (good for planned trips).
  • Prefetch planned routes: if user plans a route while online (e.g., set waypoints), prefetch and save it.
  • Edge-assisted offline: run a small OSRM/GraphHopper instance at your origin and cache responses in Redis for repeat requests; rely on cached responses when offline.
  • On-device turn-by-turn (advanced): only for teams that can bundle graph data into small MB-level extracts and integrate client routing libraries; not covered here.

For micro apps, the best tradeoff is last-route caching and proactive prefetch for user-marked areas.

Step 6 — Server-side: headers, Redis, Varnish and cache invalidation

Good server headers make client caching predictable and keep CDN/bandwidth cost down.

Tile responses

  • Set Cache-Control: public, max-age=86400, stale-while-revalidate=86400 for tiles you control and are allowed to cache (a server sketch follows this list).
  • Include strong ETag or Last-Modified headers for safe revalidation.
  • If using Varnish in front of origin, set appropriate vcl to cache tiles and respect purge/ban commands on deploy.
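
If your origin is the small Node service suggested in Step 0, a hedged Express sketch for those tile headers (the directory layout and port are assumptions):

// server.js: serve tiles with the cache headers recommended above
const express = require('express');
const app = express();

app.use('/tiles', express.static('tiles', {
  etag: true,          // strong validator for safe revalidation
  lastModified: true,
  setHeaders(res) {
    // 24h fresh + 24h stale-while-revalidate, per the guidance above
    res.set('Cache-Control', 'public, max-age=86400, stale-while-revalidate=86400');
  }
}));

app.listen(3000);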

Routing / Search API

  • Cache computed routes in Redis keyed by parameters (coords + profile). Store a TTL suited to your freshness needs (e.g., 1h for routing, 24h for POI search results); a sketch follows this list.
  • Set HTTP headers: Cache-Control: public, s-maxage=3600 for CDN/edge; vary by query parameters only if needed.
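
A hedged sketch of that Redis pattern in Node (node-redis v4 client; the key scheme and computeRoute() are placeholders for your OSRM/GraphHopper call):

// route-cache.js: cache computed routes in Redis keyed by parameters
import {createClient} from 'redis';

const redis = createClient();
await redis.connect();

export async function getRouteCached(from, to, profile) {
  const key = `route:${profile}:${from}:${to}`; // hypothetical key scheme
  const hit = await redis.get(key);
  if (hit) return JSON.parse(hit);
  const route = await computeRoute(from, to, profile); // your routing engine call
  await redis.set(key, JSON.stringify(route), {EX: 3600}); // 1h TTL, as above
  return route;
}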

Cache invalidation (CI/CD)

  1. On tile/style updates, bump the service worker cache name (tiles-v2) and deploy the updated sw.js; the byte-changed worker triggers a client update on the next load.
  2. Use automated purges: CI triggers Varnish BAN/PURGE and CDN cache purges for updated tile paths (see the CI sketch after this list).
  3. Flush related Redis keys for route/search when upstream data changes (e.g., map corrections).
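
A sketch of step 2 as a CI script. This assumes your Varnish VCL maps an HTTP BAN request on a path to a ban; VCL setups vary, so treat it as a template:

// ci-purge.js: deploy-time Varnish ban (Node 18+, built-in fetch)
const VARNISH_URL = process.env.VARNISH_URL || 'http://cache.internal'; // placeholder host

async function purgeTiles(pathPrefix) {
  const res = await fetch(`${VARNISH_URL}${pathPrefix}`, {method: 'BAN'});
  if (!res.ok) throw new Error(`Ban failed with ${res.status}`);
  console.log(`Banned ${pathPrefix}`);
}

await purgeTiles('/tiles/'); // run from CI after a tile/style deploy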

This pattern ensures clients receive a new SW on next load while still using cached tiles until the new SW claims clients — predictable and transparent.

Operational notes and benchmarks

Practical numbers help non-developers pick defaults. Typical web raster tiles are 10–60KB each; vector tiles are often smaller (5–20KB). If you prefetch 500 raster tiles at 30KB average, expect ~15MB download. For mobile-first micro apps, aim for <50–100MB user downloads.

  • Prefetch budget: ~50MB default (roughly 1,700 raster tiles at a 30KB average, or ~4,000 vector tiles at 12KB).
  • Zoom levels: 12–15 for town-level navigation; 10–14 for regional drive routes.
  • Concurrency: 6–12 parallel fetches to balance battery and throughput.

Security, licensing and privacy considerations

  • Follow tile provider licensing: many providers restrict offline caching. Use vendors with explicit offline licenses or self-hosted tiles.
  • Respect user privacy: do not store PII in IndexedDB without consent. If storing recent searches, allow users to clear local storage easily.
  • Verify TLS for API & tiles. Service workers obey same-origin rules unless CORS is configured properly for cross-origin tile hosts.

Advanced tips and future-proofing (2026+)

  • Vector tiles + client rendering: increasingly common in 2026. Vector tiles reduce bandwidth and let you ship smaller style updates; cache them the same way you cache raster tiles but expect slightly different mime-types (application/x-protobuf for pbf).
  • Edge compute for routing: push route calculations to edge functions and cache responses in Redis-compatible stores for ultra-low latency and easier scaling.
  • Push-style updates: use a light push or periodic background sync to refresh key tiles/routes when the device is on Wi‑Fi, keeping offline data fresh while avoiding cellular cost (see the sync sketch after this list).
  • Telemetry and usage limits: track how many tiles users prefetch, since that drives provider billing, and build guardrails in the UI to prevent accidental large downloads.
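
For the background-refresh idea above, a hedged sketch using the Periodic Background Sync API (Chromium-only and permission-gated as of early 2026; treat it as progressive enhancement):

// periodic-refresh.js: opt-in background refresh (sketch)
async function enableBackgroundRefresh() {
  const reg = await navigator.serviceWorker.ready;
  if (!('periodicSync' in reg)) return; // unsupported: skip silently
  const perm = await navigator.permissions.query({name: 'periodic-background-sync'});
  if (perm.state !== 'granted') return;
  await reg.periodicSync.register('refresh-tiles', {minInterval: 24 * 3600 * 1000});
}

// sw.js addition: re-fetch a stored tile list on the sync event
self.addEventListener('periodicsync', event => {
  if (event.tag !== 'refresh-tiles') return;
  event.waitUntil((async () => {
    const cache = await caches.open(TILE_CACHE);
    for (const t of savedTileList) { // savedTileList: placeholder for your stored area
      const url = `/tiles/${t.z}/${t.x}/${t.y}.png`;
      try {
        const res = await fetch(url);
        if (res.ok) await cache.put(url, res);
      } catch (e) { /* skip failed tiles; retry on the next sync */ }
    }
  })());
});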

Troubleshooting — common problems and fixes

Tiles not showing offline

  • Check service worker registration and scope.
  • Inspect Cache Storage in DevTools to confirm tiles are present.
  • Confirm Cache-Control headers allow caching (for third-party tiles, CORS and headers matter).

Routes missing when offline

  • Ensure route responses are saved into IndexedDB after each fetch, either directly from the page or by having the SW forward them via postMessage (see the Step 2 sketch).
  • Check TTL logic — you may be expiring routes too aggressively.

Large downloads or unexpected bills

  • Enforce prefetch caps and show accurate download size estimate before confirmation.
  • Prefer vector tiles and narrower zoom ranges for smaller footprint.

Mini case study (real-world pattern)

A field ops team in late 2025 built a micro app used by 50 technicians to reach sites with intermittent connectivity. They chose MapTiler with an enterprise offline license, implemented tile prefetch for project sites (zoom 13–15), and cached the last 5 routes per user in IndexedDB. On average they reduced the team's cellular data by 70% and cut emergency support calls by 40%, because technicians could continue following the last route even without connectivity.

Starter checklist for non-developers (copy-paste actions)

  1. Pick a PWA template editor (PWA Builder, Glitch).
  2. Choose tile provider and confirm offline caching license.
  3. Paste service worker (sw.js) and register script into the template.
  4. Paste idb helper and small UI script to trigger prefetch and save routes.
  5. Set server Cache-Control headers (if you manage origin): tiles 24h + stale-while-revalidate; APIs s-maxage 1h.
  6. Test offline: install the PWA, prefetch a small area, go into airplane mode, and verify tiles & last route appear.

Actionable takeaways

  • Start small: implement last-route caching and tile prefetch for a single site before expanding.
  • Make caching explicit: show download size and status in the UI so users consent to offline storage.
  • Use versioned caches and CI purge: bump cache names on style or tile updates so clients get predictable refreshes.
  • Protect costs: enforce tile count/zoom caps and prefer vector tiles for smaller downloads.

Where to go next

If you want a working starter: create a PWA template, add the provided sw.js and idb helpers, and experiment with a small bounding box at a couple of zoom levels. If you run into provider licensing complexity, consider running a low-cost tileserver (tileserver-gl or MapTiler Cloud with an offline license) so you control caching rules.

"Make offline experience predictable: explicit downloads, versioned caches, and clear TTLs keep users happy and your costs manageable."

Call to action

Ready to build your own offline navigation micro app? Start with a PWA template now: paste the service worker and idb helpers, configure a small tile area, and test in airplane mode. Before scaling up, write down your constraints (tile provider, expected tile budget, and offline routing needs), then adapt the checklist and the CI/CD purge sketch above into your pipeline.

Advertisement

Related Topics

#PWA #maps #tutorial

cached

Contributor

Senior editor and content strategist. Writing about technology, design, and the future of digital media. Follow along for deep dives into the industry's moving parts.
