DEV Community

John D


I Built 75 Browser-Based Tools With Next.js and Zero Backend.

No waitlists. No accounts. No paywalls. Just tools that work.

I kept running into the same problem: I needed a quick text formatter, word counter, or list sorter, and every existing option was bloated, ad-heavy, or demanded an account before it would do anything.

So I built usefmtly.com — a collection of 75 browser-based tools covering text formatting, converters, random generators, list tools, and code utilities. The idea was straightforward: make useful tools that open instantly, solve one problem well, and don't demand an account, a trial, or your data before they do their job.

That meant a few constraints from the start:

  • no login
  • no backend dependency for core functionality
  • no "free trial"
  • just open the page and use the tool

Everything runs client-side. Nothing leaves your browser.

Here's how I built it, what the stack looks like, and what I learned shipping it this way.


Why 75 Tools?

Partly because I like small utilities, and partly because this kind of product maps really well to search.

The model is simple:

  • useful tools capture action-oriented intent
  • long-form guides capture informational intent
  • internal links connect the two

For example:

  • a tool targets something like "linkedin text formatter"
  • a guide targets something like "how to bold text on LinkedIn"

Together they cover both "I need this now" and "I want to understand how this works."

I focused on niches where:

  • monthly search volume was roughly 1k–500k
  • keyword difficulty was under 20
  • existing results were weak, outdated, or overloaded with ads

Then I built the tool, paired it with a guide when it made sense, and moved on.


The Stack

The stack is intentionally minimal.

Next.js 15 + React 19
Static export using output: 'export'. No server, no API routes, no database. Every page is deployed as static HTML.
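For reference, the static-export setup comes down to a single config flag. This is a minimal sketch (Next.js 15 also accepts a TypeScript config file); the real config likely sets more options:

```typescript
// next.config.ts — minimal static-export setup
import type { NextConfig } from 'next'

const nextConfig: NextConfig = {
  // Emit plain HTML/CSS/JS at build time; no Node server, API routes, or ISR
  output: 'export',
}

export default nextConfig
```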

Tailwind CSS
Shared design system and reusable components. Dark mode by default.

TypeScript
Typed tool registry, typed configs, typed component props. I wanted adding new tools to feel mechanical instead of fragile.

Vercel
Deploy from GitHub on every push.

That's it.

The simplicity is the point: it keeps the site fast, cheap to run, and easy to maintain.


How the Tool System Works

Every tool is registered in a central registry:

{
  slug: 'linkedin-text-formatter',
  name: 'LinkedIn Text Formatter',
  category: 'text-tools',
  description: 'Bold, italic & 20+ Unicode styles for LinkedIn posts.',
  keywords: ['linkedin formatter', 'bold text linkedin'],
  relatedTools: ['fancy-text-generator', 'small-text-generator'],
}

From the registry, the site auto-generates:

  • the tool page at /tools/[category]/[slug]
  • sitemap entry
  • llms.txt
  • related tools blocks
  • category hub pages
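A rough sketch of how a registry like this can drive everything else. The `ToolConfig` interface and helper names here are my own guesses at the shape, not the site's actual code:

```typescript
// Hypothetical registry-driven build helpers (illustrative, not the site's code)
interface ToolConfig {
  slug: string
  name: string
  category: string
  description: string
  keywords: string[]
  relatedTools: string[]
}

const tools: ToolConfig[] = [
  {
    slug: 'linkedin-text-formatter',
    name: 'LinkedIn Text Formatter',
    category: 'text-tools',
    description: 'Bold, italic & 20+ Unicode styles for LinkedIn posts.',
    keywords: ['linkedin formatter', 'bold text linkedin'],
    relatedTools: ['fancy-text-generator', 'small-text-generator'],
  },
]

// Params for generateStaticParams() in app/tools/[category]/[slug]/page.tsx
function toolParams() {
  return tools.map(({ category, slug }) => ({ category, slug }))
}

// One sitemap entry per registered tool
function sitemapEntries(baseUrl: string) {
  return tools.map((t) => ({ url: `${baseUrl}/tools/${t.category}/${t.slug}` }))
}
```

The same array can feed the related-tools blocks and category hubs, so each new entry fans out across the whole site automatically.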

So adding a new tool usually means:

  1. write the engine logic
  2. add the registry entry
  3. build the client component
  4. write the article content
  5. deploy

I kept it to one Linear ticket per tool, which made the work much easier to manage.


The Article Pattern That Saved Performance

Each tool page includes an embedded SEO article, usually around 400–600 words covering what the tool does, when to use it, and related concepts.

My early mistake was putting the article component inside a 'use client' file, which shipped hundreds of words of static prose as part of the client bundle and dragged the Lighthouse score down.

The fix was simple: keep the article as a server component and pass it into the interactive client component as a prop.

// page.tsx (server component)
export default function Page() {
  return (
    <FormatterToolClient article={<LinkedInFormatterArticle />} />
  )
}

// FormatterToolClient.tsx
'use client'

import type { ReactNode } from 'react'

export function FormatterToolClient({ article }: { article?: ReactNode }) {
  // interactive tool logic here
  return (
    <>
      <ToolUI />
      {article}
    </>
  )
}

That brought Lighthouse back to 100.

The lesson was obvious in hindsight: render content on the server, keep the client boundary only around the interactive bits.


Shipping 75 Tools Without Losing Your Mind

A few things made the project much easier to ship than I expected.

  1. One ticket at a time
    Each tool got its own Linear ticket, its own commit, and its own deploy. No giant "tool batch" tasks. That forced me to actually finish things.

  2. Shared components early
    I built reusable pieces like ToolLayout, ToolCopyButton, ThemeSwitcher, and ToolToggle up front. After that, consistency came almost for free.

  3. Keyword research before implementation
    Every tool started with keyword validation. If there wasn't search demand, I didn't build it.

  4. Static export as a constraint
    Not having a backend removed a lot of tempting shortcuts. Every tool had to be genuinely browser-based.


What's Live

75 tools across 5 categories:

  • Text tools — LinkedIn formatter, word counter, case converter, glitch text, fancy text, strikethrough, and 20+ more
  • Converters — Hex/RGB/HSL, temperature, decimal/binary, number to words, Roman numerals
  • List tools — alphabetical sorter, deduplicator, merge lists, list compare, randomizer
  • Random generators — colors, names, passwords, UUIDs, animals, countries, emojis
  • Code tools — JSON/YAML formatter, HTML↔Markdown, CSV↔JSON, JWT decoder, ASCII art
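To give a flavor of what "genuinely browser-based" means, here's a toy CSV → JSON engine. This is my own sketch, not the site's implementation, and it skips the quoted fields and escaping a real converter has to handle:

```typescript
// Toy CSV → JSON converter (illustrative; no quoting/escaping support)
function csvToJson(csv: string): Record<string, string>[] {
  const [headerLine, ...rows] = csv.trim().split('\n')
  const headers = headerLine.split(',').map((h) => h.trim())
  // Map each data row to an object keyed by the header names
  return rows.map((row) => {
    const cells = row.split(',')
    return Object.fromEntries(headers.map((h, i) => [h, (cells[i] ?? '').trim()]))
  })
}
```

Everything happens in the page's own JavaScript, so the data never has to leave the browser.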

I also added 18 long-form guides around adjacent informational keywords like word count, percentage calculators, lorem ipsum, and JSON formatting.


The Numbers Right Now

The site is still early.

It's launched, indexed in Google Search Console, and now I'm waiting to see how rankings settle over the next few months.

The bet is that 75 tools + 18 guides creates enough topical surface area for organic traffic to compound over time.

We'll see if that plays out.


What I'd Do Differently

A few things are already obvious.

Build the guide layer earlier
I focused on tools first and guides second. In practice, those should have been built in parallel.

Set up rank tracking on day one
I didn't, and now I'm retroactively tracking a large batch of keywords instead of watching them from launch.

Lean into internal linking sooner
Once you have dozens of tools, the structure matters almost as much as the individual pages.


Why I Like This Model

I like products that stay small and useful.

No onboarding.
No "book a demo."
No friction between intent and action.

Just pages that load fast and do one thing well.

For utility sites, that feels like the right tradeoff.


Check It Out

usefmtly.com — free, no account, no ads (for now).

If you've built anything similar — utility sites, SEO-first products, or small tools with a static architecture — I'd love to hear what worked for you.


Built with Next.js 15, React 19, Tailwind, TypeScript, and Vercel. All tools run client-side.

Top comments (8)

Apex Stack

Great question on language differentiation. The underlying financial data (price, ratios, market cap) is identical across languages — what changes is the analysis layer. Each locale gets its own templated narrative that frames the data differently based on local market context. For example, a German page for a US stock might emphasize USD/EUR exposure, while the Dutch version might reference AEX-listed peers for comparison.

The real SEO value isn't the data itself but the long-tail keyword coverage. "AAPL Aktienanalyse" has far less competition than "AAPL stock analysis" — same data, different SERP landscape.

Smart move dropping FAQPage — I hadn't caught that deprecation outside gov/health. We still have it on some pages, need to clean that up. And agreed on WebApplication over SoftwareApplication for browser tools, that's the right call.

Rank tracking from day one is the lesson I keep relearning too. We're retroactively trying to establish baselines now and it's painful — you lose the ability to measure the impact of your early optimizations.

John D

That's a really smart way to handle it. Using the analysis layer to differentiate the pages makes a lot of sense if the raw financial data is the same.

Also agree about multilingual SEO — people often think it’s just translation, but the keyword landscape can be completely different. Targeting queries like "AAPL Aktienanalyse" instead of the English terms is a clever way to find less competitive SERPs.

Your point about crawl budget at scale is also something I’m starting to think about early. Static tools make it very easy to generate hundreds of pages, but keeping the overall quality high seems to be the real challenge once you pass a certain size.

Out of curiosity - at ~100k pages, how are you handling sitemaps and indexing? Splitting by locale/section or just letting Google figure it out?

Apex Stack

Great question — sitemaps are definitely something I had to think through carefully at this scale.

I split them by locale and page type. So there's a separate sitemap index for /en/stock/, /de/stock/, /nl/stock/, etc., and then separate ones for sectors, ETFs, and so on. Each sitemap stays under 50k URLs (Google's limit), and I submit the sitemap index to GSC rather than individual files.

The key lesson: don't just dump 100k URLs in one sitemap and hope Google figures it out. Segment by priority so you can track which sections are getting crawled. I found that sector pages index much faster than individual stock pages, probably because they have more internal links pointing to them and more unique content per page.

Also worth noting — hreflang annotations across 12 languages create a LOT of cross-references. I handle those in the sitemaps rather than on-page to keep the HTML clean and fast.

John D

the hreflang-in-sitemaps approach is something i keep seeing recommended but haven't tried yet — does it actually keep the on-page HTML noticeably cleaner at scale? i can imagine 12 × 100k pages worth of hreflang annotations getting unwieldy fast in the `<head>`.

the sitemap segmentation by priority makes total sense. i've been thinking about this for when we scale past 200 tools — splitting by category so you can actually see which clusters google is crawling vs. sitting on.

Apex Stack

This is eerily similar to how I approached building a programmatic SEO site — except I went the other direction: 100k+ pages of financial data across 12 languages instead of 75 utility tools. Same core philosophy though.

Your tool registry pattern is almost identical to what I use with Astro. Central config per page type, auto-generated routes, sitemap entries, and internal linking — all from a single source of truth. Once you have that mechanical feel of "add config, deploy, done," scaling becomes a completely different game.

A few things that might help as you watch rankings settle:

Internal linking is going to be your biggest lever at 75+ pages. You mentioned leaning into it sooner — I'd go aggressive. Every tool page should link to 3-5 related tools AND the corresponding guide. Every guide should link back to the tool. Google treats that cluster structure as a topical authority signal, and with 75 tools + 18 guides you have enough mass for it to matter.

The guide layer is where your long-term traffic lives. Tools capture high-intent "do" queries, but guides capture the "learn" queries that have 5-10x the volume. The fact that you're pairing them is the right call — most tool sites skip this entirely.

Watch your crawl budget carefully. At 75 pages you're fine, but if you scale to 200+ the ratio of indexed vs. discovered-not-indexed pages becomes the metric that matters. I learned this the hard way running a much larger site — Google will crawl your pages but reject them if the content doesn't meet their quality threshold at scale.

The "keyword research before implementation" discipline is something I wish more builders followed. Building 75 tools that each target validated demand is a much better bet than building 500 tools and hoping some stick.

Really clean execution. Curious how you're handling structured data — are you using SoftwareApplication schema on the tool pages?

John D • Edited

The 100k+ page financial data site sounds fascinating — especially with 12 languages. Curious how you handle differentiation between languages when the underlying data is the same, or do markets diverge enough that the pages are genuinely different?

For structured data, I’m using WebApplication schema (felt more accurate than SoftwareApplication for browser tools). Each tool page includes datePublished and dateModified in JSON-LD. I also removed FAQPage because Google deprecated its rich results outside gov/health.

One thing I wish I had done earlier is what you mentioned — proper rank tracking from day one. I’m now setting it up retroactively across 75 tools instead of having baseline data from launch. For the next project I’d instrument that before even indexing pages.
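For anyone curious, the shape I'm describing is roughly this — the values here are illustrative, built as the object that gets serialized into the JSON-LD script tag:

```typescript
// Illustrative WebApplication JSON-LD payload (example values, not real page data)
const jsonLd = {
  '@context': 'https://schema.org',
  '@type': 'WebApplication',
  name: 'LinkedIn Text Formatter',
  url: 'https://usefmtly.com/tools/text-tools/linkedin-text-formatter',
  applicationCategory: 'UtilitiesApplication',
  operatingSystem: 'Any',
  datePublished: '2025-01-01', // placeholder dates
  dateModified: '2025-06-01',
}

// Serialized into <script type="application/ld+json"> in the page head
const serialized = JSON.stringify(jsonLd)
```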

David Russell

Slight typo: this text repeated twice:
My early mistake was putting the article component inside a 'use client' file.
My early mistake was putting the article component inside a 'use client' file.

John D

Thank you!