JSON Size Analyzer

Paste any JSON to instantly see its byte size, minified size, gzip estimate, structure depth, type breakdown, and heaviest keys. Everything runs in your browser.

100% client-side

All analysis runs entirely in your browser. Your JSON data is never sent to any server. Gzip size is an estimate based on character entropy.

Why JSON Payload Size Matters

Every byte in an API response adds latency. On a mobile connection averaging 5 Mbps, a 500 KB JSON payload takes 800 ms to download before your application can even begin parsing it. Multiply that by dozens of API calls per page, and payload bloat becomes the single biggest bottleneck for perceived performance.

Understanding where those bytes come from is the first step to optimization. A size analyzer breaks down your JSON into its structural components, letting you identify oversized fields, redundant nesting, and compression opportunities before they reach production.

The Three Sizes You Need to Know

Raw Size

The byte count of your JSON as written, including formatting whitespace. This is what sits on disk or in your editor.

Minified Size

All whitespace stripped. This is the payload size most APIs actually transmit when Content-Type is application/json.

Gzip Size

The estimated transfer size with Content-Encoding: gzip enabled. Most modern servers compress responses automatically.

Strategies to Reduce JSON Payload Size

  1. Use shorter key names. In a list of 10,000 objects, renaming "customerAccountNumber" to "acctNo" can save tens of kilobytes. Field mapping on the client restores readability without the size penalty.

  2. Eliminate null and empty fields. Omitting keys with null values instead of including them shrinks every object in the payload. Use sparse representations and let the client default missing fields.

  3. Paginate large datasets. Instead of returning 10,000 records, send 50 per page. This reduces initial payload size dramatically and lets the client fetch more only when needed.

  4. Flatten nested structures. Deeply nested objects increase key repetition overhead. Normalizing relationships (like a relational database) with ID references often produces smaller, more cacheable payloads.

  5. Enable gzip or Brotli compression. Server-side compression is the single most impactful optimization. Gzip typically reduces JSON by 60-80%, and Brotli achieves even better ratios with similar decompression speed.
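Strategies 1 and 2 can be sketched together. `KEY_MAP`, `compact`, and `expandKeys` are hypothetical names for illustration, not part of any particular API:

```javascript
// Hypothetical key map: short wire names -> readable client names.
const KEY_MAP = { acctNo: "customerAccountNumber", nm: "name" };

// Strategy 2: strip null and empty fields before serializing.
function compact(obj) {
  return Object.fromEntries(
    Object.entries(obj).filter(([, v]) => v !== null && v !== "")
  );
}

// Strategy 1: restore readable keys on the client after parsing.
function expandKeys(obj) {
  return Object.fromEntries(
    Object.entries(obj).map(([k, v]) => [KEY_MAP[k] ?? k, v])
  );
}

const record = { acctNo: "A-1001", nm: "Ada", middleName: null, notes: "" };
const wire = JSON.stringify(compact(record));
const restored = expandKeys(JSON.parse(wire));

console.log(wire);     // {"acctNo":"A-1001","nm":"Ada"}
console.log(restored); // { customerAccountNumber: "A-1001", name: "Ada" }
```

The savings compound: every omitted null and every shortened key is multiplied by the number of objects in the list.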

Understanding Structure Depth and Complexity

The structure statistics reveal the internal complexity of your JSON. High nesting depth can indicate overly complex data models that are difficult to maintain and expensive to parse. A large number of string values relative to numbers often means there are opportunities to use enums or numeric codes instead.

The top largest keys chart shows where the most bytes are concentrated. If a single key accounts for more than half the total size, that field is a prime candidate for lazy loading, separate API endpoints, or compression. Common culprits include base64-encoded images, long descriptions, and embedded HTML content.
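A rough sketch of how per-key weight can be measured (`keySizes` is a hypothetical helper that serializes each top-level entry on its own and sorts heaviest first):

```javascript
// Approximate bytes contributed by each top-level key: the quoted key,
// the separator, and the serialized value.
function keySizes(obj) {
  const enc = new TextEncoder();
  return Object.entries(obj)
    .map(([k, v]) => [
      k,
      enc.encode(JSON.stringify(k) + ":" + JSON.stringify(v)).length,
    ])
    .sort((a, b) => b[1] - a[1]); // heaviest first
}

const payload = {
  id: 7,
  title: "Report",
  // A base64 blob dominating the payload, as described above.
  thumbnail: "data:image/png;base64," + "A".repeat(500),
};

console.log(keySizes(payload)[0][0]); // "thumbnail"
```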

Frequently Asked Questions

How is JSON size calculated?

JSON size is calculated with the Blob API, which returns the exact byte length of the string encoded as UTF-8. This accounts for multi-byte characters such as emoji and non-Latin scripts, giving the true network transfer size rather than just the character count.
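A small illustration of why character count and byte count differ:

```javascript
const s = '{"emoji":"😀"}';

// String.length counts UTF-16 code units, not bytes.
console.log(s.length); // 14

// TextEncoder gives the UTF-8 byte length -- the true network size.
// In a browser, new Blob([s]).size returns the same value.
const bytes = new TextEncoder().encode(s).length;
console.log(bytes); // 16 (the emoji alone takes 4 bytes)
```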

What is the difference between minified and gzipped JSON size?

Minified JSON has all unnecessary whitespace removed but remains plain text. Gzipped JSON is further compressed using the DEFLATE algorithm, which exploits repeated patterns in the data. Typically, gzip reduces minified JSON to 30-60% of its original size depending on content repetitiveness.

Why should I care about JSON payload size?

JSON payload size directly impacts API response time, bandwidth costs, mobile performance, and Core Web Vitals scores. Large payloads increase Time to First Byte, slow down JavaScript parsing, and consume user data on metered connections.

How accurate is the gzip size estimate?

The gzip estimate is a heuristic based on character entropy analysis. It provides a reasonable approximation but is not exact because true gzip compression depends on specific byte patterns and the DEFLATE algorithm. For precise values, use your server or a command-line tool.
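For illustration, one way such an entropy heuristic might look. This is an assumption about the general approach, not the tool's exact formula:

```javascript
// Shannon entropy of the byte distribution gives a lower bound on
// bits-per-byte for an order-0 coder; it ignores DEFLATE's pattern
// matching, so it is only an approximation.
function estimateGzipBytes(text) {
  const bytes = new TextEncoder().encode(text);
  const freq = new Map();
  for (const b of bytes) freq.set(b, (freq.get(b) ?? 0) + 1);

  let bitsPerByte = 0;
  for (const count of freq.values()) {
    const p = count / bytes.length;
    bitsPerByte -= p * Math.log2(p);
  }

  // ~20 bytes of fixed gzip header/trailer overhead (assumed constant).
  return Math.ceil((bytes.length * bitsPerByte) / 8) + 20;
}

console.log(estimateGzipBytes('{"status":"active","status2":"active"}'));
```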

What is JSON depth and why does it matter?

JSON depth is the maximum level of nesting in the structure. Deeply nested JSON can cause stack overflows in recursive parsers, makes data harder to query, and often indicates overly complex data models that should be flattened or normalized.
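A minimal depth calculation can be sketched as a recursive walk (`jsonDepth` is a hypothetical helper, counting scalars as depth 0 and each enclosing object or array as one level):

```javascript
// Maximum nesting depth: scalars are 0, each container adds 1.
function jsonDepth(value) {
  if (value === null || typeof value !== "object") return 0;
  const children = Object.values(value); // works for arrays and objects
  return 1 + (children.length ? Math.max(...children.map(jsonDepth)) : 0);
}

console.log(jsonDepth(42));                      // 0
console.log(jsonDepth({ a: { b: [1, 2, 3] } })); // 3
```

Note this walk is itself recursive, so it shares the stack-depth limitation it measures; an iterative version with an explicit stack would be needed for pathologically deep input.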
