When a recruiter searches your name on Google, what shows up defines the first impression. Controlling that is a technical skill — measurable, reproducible, documentable.
This is the first article in a 3-part series where I do exactly that live: going from zero indexation to ranking for "rafael cavalcanti da silva" and "rafaelroot" on Google. Real screenshots, real metrics, no theory.
Full original article: rafaelroot.com/en/blog/seo-case-study-google-ranking-part-1/
TL;DR — Week 1 (03/08/2026)
- Baseline: zero results for my site in the top 100 for "rafael cavalcanti" or "rafael cavalcanti da silva".
- Stack: Astro 5.17 (SSG) + Nginx with HTTP/2, Brotli, HSTS and security headers.
- Google Search Console configured, sitemap accepted: 16 URLs discovered.
- PageSpeed Mobile: 58 → 87 (+29 points) with 9 documented fixes.
- All 9 fixes applied against official guidelines (Google's documentation, plus WCAG for contrast) — no guesswork.
The diagnosis (baseline on 03/08/2026)
Before any optimization, I screenshotted each target SERP. This is the zero point to compare against in the coming months.
"rafael cavalcanti"
| Position | Result | Domain |
|---|---|---|
| 1 | Rafael Cavalcanti — Bradesco (LinkedIn) | linkedin.com |
| 2 | Rafael Cavalcanti (@rafaelcavalcantig) | instagram.com |
| 3 | Rafael Cavalcanti Garcia de Castro Alves | escavador.com |
| 4 | Rafael Cavalcanti — Lawsuits | jusbrasil.com.br |
| 5 | Rafael Cavalcanti de Souza — FAPESP | bv.fapesp.br |
Diagnosis: SERP dominated by namesakes on high-authority platforms. Zero mention of rafaelroot.com.
"rafael cavalcanti da silva"
| Position | Result | Domain |
|---|---|---|
| 1 | Rafael Cavalcanti da Silva — Lawsuits | jusbrasil.com.br |
| 2 | 7 profiles with the name | br.linkedin.com |
| 3 | Who is Rafael Chocolate... | g1.globo.com |
| 4–6 | Business registrations and influencer | serasa, instagram, istoedinheiro |
Diagnosis: Moderate competition, but dominated by negative content (lawsuits, convictions) — which opens a real opportunity.
Formal baseline
```text
╔══════════════════════════════════════════════════════════════╗
║ BASELINE — 03/08/2026                                        ║
╠══════════════════════════════════════════════════════════════╣
║ "rafael cavalcanti da silva"  → outside top 100              ║
║ "rafael cavalcanti"           → outside top 100              ║
║ "rafaelroot"                  → nonexistent word (vol. 0)    ║
║                                                              ║
║ Indexed pages:             0                                 ║
║ Search Console impressions: 0                                ║
║ Organic clicks:            0                                 ║
║ Known backlinks:           0                                 ║
╚══════════════════════════════════════════════════════════════╝
```
The plan: 3 phases, 12 weeks
| Phase | Period | Focus |
|---|---|---|
| 1 — Foundations | Weeks 1-2 | Baseline, stack, Search Console, GA4, technical fixes |
| 2 — Content & Authority | Weeks 3-6 | Optimized articles, profiles, link building, cross-links |
| 3 — Measurement & Tuning | Weeks 7-12 | A/B title tests, real CWV, Part 3 with results |
The stack
Why Astro?
```text
Astro SSG → Static HTML → 0kb JS by default
├── LCP: < 1.0s
├── FID: 0ms (no JS = no blocking)
├── CLS: 0 (no reflow)
└── TTFB: ~50ms (static file via Nginx)
```
| Framework | JS Bundle | Typical LCP | SEO Score |
|---|---|---|---|
| Astro (SSG) | 0 KB | < 1s | 100 |
| Next.js (SSR) | ~80 KB | 1.5-2.5s | 90-95 |
| Create React App | ~200 KB | 3-5s | 50-70 |
| WordPress | ~150 KB | 2-4s | 60-80 |
Sitemap with priorities and i18n (Astro)
```js
// astro.config.mjs
import { defineConfig } from 'astro/config';
import sitemap from '@astrojs/sitemap';

export default defineConfig({
  site: 'https://rafaelroot.com',
  integrations: [
    sitemap({
      i18n: {
        defaultLocale: 'pt-br',
        locales: { 'pt-br': 'pt-BR', pt: 'pt-PT', es: 'es', en: 'en', ru: 'ru' },
      },
      serialize(item) {
        if (item.url === 'https://rafaelroot.com/') item.priority = 1.0;
        else if (item.url.match(/\/blog\/.+\//)) {
          item.priority = 0.9;
          item.changefreq = 'monthly';
        }
        return item;
      },
    }),
  ],
});
```
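Because `serialize()` is plain JavaScript, its priority logic can be checked in isolation before a build. A minimal sketch (the sample URLs below are stand-ins, not the real sitemap entries):

```js
// Standalone replica of the serialize() priority logic above,
// so it can be exercised without running a full Astro build.
function serialize(item) {
  if (item.url === 'https://rafaelroot.com/') item.priority = 1.0;
  else if (item.url.match(/\/blog\/.+\//)) {
    item.priority = 0.9;
    item.changefreq = 'monthly';
  }
  return item;
}

const home = serialize({ url: 'https://rafaelroot.com/' });
const post = serialize({ url: 'https://rafaelroot.com/blog/some-article/' });

console.log(home.priority, post.priority, post.changefreq); // 1 0.9 monthly
```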
Nginx: SEO-focused configuration
```nginx
server {
    listen 443 ssl http2;
    server_name rafaelroot.com www.rafaelroot.com;

    # HSTS — 1 year with preload
    add_header Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" always;

    # Security headers
    add_header X-Content-Type-Options "nosniff" always;
    add_header X-Frame-Options "SAMEORIGIN" always;
    add_header Referrer-Policy "strict-origin-when-cross-origin" always;

    # Brotli + Gzip
    brotli on;
    brotli_comp_level 6;
    gzip on;

    # Aggressive cache for immutable assets
    location ~* \.(css|js|woff2|png|webp|avif|svg|ico)$ {
        expires 1y;
        add_header Cache-Control "public, immutable";
    }

    # Always fresh HTML
    location ~* \.html$ {
        expires 1h;
        add_header Cache-Control "public, must-revalidate";
    }
}
```
Why each directive matters for SEO:
| Directive | SEO Impact |
|---|---|
| `http2` | Multiplexing → lower TTFB → better LCP |
| `brotli` | ~20% smaller than gzip → reduced transfer size |
| `expires 1y` + `immutable` | Eliminates re-downloads → better FCP |
| HSTS | Google favors HTTPS → ranking signal |
Google Search Console ✅
Sitemap accepted on 03/08/2026:
| Metric | Value |
|---|---|
| Sitemap | `sitemap-index.xml` → `sitemap-0.xml` |
| Status | Success |
| Discovered URLs | 16 (5 homepages + 5 blog indexes + 6 articles) |
9 documented technical fixes
🚨 Fix 1 — robots.txt was blocking Googlebot's CSS ✅
The problem: Disallow: /_astro/ prevented Googlebot from accessing Astro's bundled CSS. Google was seeing raw HTML without styles — this can silently hurt your ranking.
Google's own documentation states:
"For this, Google needs to be able to access the same resources as the user's browser. If your site is hiding important components that make up your website (like CSS and JavaScript), Google might not be able to understand your pages."
```txt
# robots.txt — BEFORE (blocking CSS)
User-agent: *
Disallow: /_astro/    # ← blocked the bundled CSS!

# robots.txt — AFTER (correct)
User-agent: *
Allow: /
# /_astro/ removed — Googlebot needs the CSS
```
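To sanity-check a robots.txt edit before deploying, a toy prefix matcher is enough to show what the old rule blocked. This is a deliberate simplification (real parsers also handle wildcards, Allow/Disallow precedence, and per-agent groups):

```js
// Simplified robots.txt check: a path is blocked if any
// Disallow rule is a prefix of it.
function isBlocked(disallowRules, path) {
  return disallowRules.some((rule) => rule !== '' && path.startsWith(rule));
}

const oldRules = ['/_astro/']; // BEFORE
const newRules = [];           // AFTER (Allow: / only)

console.log(isBlocked(oldRules, '/_astro/index.abc123.css')); // true (CSS blocked)
console.log(isBlocked(newRules, '/_astro/index.abc123.css')); // false
console.log(isBlocked(oldRules, '/blog/some-article/'));      // false
```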
How to verify: Search Console → URL Inspection → "Test Live URL" → Screenshot. If CSS doesn't load, you'll see unstyled HTML.
🔧 Fix 2 — Removed <meta name="keywords"> ✅
Google explicitly states:
"Google Search doesn't use the keywords meta tag."
Besides being useless, it exposes your keyword strategy to competitors. Removed from both layouts.
🌐 Fix 3 — hreflang on blog articles ✅
The homepage had correct hreflang for 5 languages. Blog articles had none. I implemented a translationKey system in the frontmatter:
```yaml
# article frontmatter
translationKey: "reverse-engineering-android-frida"
```
Articles sharing the same translationKey are automatically cross-linked at build time:
```html
<!-- Generated output for the PT-BR article -->
<link rel="alternate" hreflang="pt-BR" href="https://rafaelroot.com/blog/engenharia-reversa-android-frida/" />
<link rel="alternate" hreflang="en" href="https://rafaelroot.com/en/blog/reverse-engineering-android-frida/" />
<link rel="alternate" hreflang="x-default" href="https://rafaelroot.com/blog/engenharia-reversa-android-frida/" />
```
Without this, Google may index only one version and treat the other as duplicate content.
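The build-time cross-linking can be sketched as a small grouping step. The article objects below are a simplified stand-in for the site's actual frontmatter data:

```js
// Group articles by translationKey, then emit one hreflang <link>
// per sibling plus an x-default pointing at the pt-BR version.
function hreflangLinks(articles, article) {
  const siblings = articles.filter(
    (a) => a.translationKey === article.translationKey
  );
  const links = siblings.map(
    (a) => `<link rel="alternate" hreflang="${a.hreflang}" href="${a.url}" />`
  );
  const def = siblings.find((a) => a.hreflang === 'pt-BR');
  if (def) {
    links.push(`<link rel="alternate" hreflang="x-default" href="${def.url}" />`);
  }
  return links;
}

const articles = [
  { translationKey: 'frida', hreflang: 'pt-BR',
    url: 'https://rafaelroot.com/blog/engenharia-reversa-android-frida/' },
  { translationKey: 'frida', hreflang: 'en',
    url: 'https://rafaelroot.com/en/blog/reverse-engineering-android-frida/' },
];

console.log(hreflangLinks(articles, articles[0]).length); // 3 link tags
```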
⚡ Fix 4 — CSS/Fonts no longer block rendering ✅
The problem: Google Fonts and devicon.min.css loaded synchronously in <head> were blocking First Contentful Paint. On slow mobile, this cost up to 4,570ms.
```html
<!-- BEFORE (render-blocking) -->
<link href="https://fonts.googleapis.com/..." rel="stylesheet" />

<!-- AFTER (non-blocking) -->
<link rel="preload" as="style" href="https://fonts.googleapis.com/..."
      onload="this.onload=null;this.rel='stylesheet'" />
<noscript><link rel="stylesheet" href="https://fonts.googleapis.com/..." /></noscript>
```
The browser paints immediately with system fonts and swaps when custom fonts load — without blocking FCP.
🎨 Fix 5 — AA contrast fixed ✅
The --muted color (#5a7a9a) against background #040a12 had a ratio of 4.45:1 — below the WCAG AA minimum of 4.5:1.
```css
/* BEFORE */
--muted: #5a7a9a; /* 4.45:1 — FAILS AA */

/* AFTER */
--muted: #7090b0; /* 5.96:1 — PASSES both: bg and glass nav */
```
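The ratios above come straight from the WCAG relative-luminance formula, which is small enough to keep in a design-token test. A sketch:

```js
// Relative luminance per WCAG 2.x for a '#rrggbb' hex color.
function luminance(hex) {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio (lighter + 0.05) / (darker + 0.05).
function contrast(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrast('#5a7a9a', '#040a12')); // just under the 4.5:1 AA minimum
console.log(contrast('#7090b0', '#040a12')); // ≈ 5.96, passes AA
```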
📱 Fix 6 — Mobile performance: content-visibility ✅
Google uses Mobile-First Indexing — the mobile score is what affects your ranking. Our page has 9 sections but the browser was rendering all of them at once.
```css
/* Skip rendering for off-screen sections */
section {
  content-visibility: auto;
  contain-intrinsic-size: auto 600px;
}

/* Disable expensive GPU effects on mobile */
@media (max-width: 768px) {
  .nav-outer { backdrop-filter: blur(8px); } /* was 16px */
  .btn, .hero-badge, .bio-grid { backdrop-filter: none; }
  .hero-badge { animation: none; }
}
```
🔤 Fix 7 — CLS: font fallbacks with size-adjust ✅
font-display: swap caused layout shift when Inter and JetBrains Mono replaced system fonts. Fix: declare @font-face fallbacks with matching metrics.
```css
@font-face {
  font-family: 'JetBrains Mono Fallback';
  src: local('Courier New');
  size-adjust: 108%; /* calibrated metrics for zero reflow */
}

@font-face {
  font-family: 'Inter Fallback';
  src: local('Arial');
  size-adjust: 100%;
}

/* Updated stacks */
--mono: 'JetBrains Mono', 'JetBrains Mono Fallback', monospace;
--sans: 'Inter', 'Inter Fallback', system-ui, sans-serif;
```
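The size-adjust percentage is roughly the ratio of the web font's average glyph advance to the fallback's, so swapped text occupies the same width. A sketch with illustrative advance values (real calibration reads the metrics from the font files, e.g. with a tool like fontkit):

```js
// size-adjust ≈ (primary avg advance / fallback avg advance) * 100.
// The advance widths passed in below are illustrative placeholders,
// not measured values from the actual font files.
function sizeAdjust(primaryAvgAdvance, fallbackAvgAdvance) {
  return `${((primaryAvgAdvance / fallbackAvgAdvance) * 100).toFixed(0)}%`;
}

console.log(sizeAdjust(648, 600)); // "108%"
console.log(sizeAdjust(600, 600)); // "100%"
```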
🏷️ Fix 8 — Optimized titles ✅
Problem 1 — Truncation: Titles at ~87 chars. Google truncates at ~55-60 chars.
Problem 2 — Redundant site name: Since we have WebSite schema with name: 'rafaelroot.com', Google already shows the site name separately in SERPs. Including it in <title> wasted 18 chars.
Problem 3 — Mixed delimiters: | and — in the same title.
```text
# BEFORE (87 chars, truncated)
Rafael Cavalcanti da Silva | Fullstack Developer & Security Specialist — rafaelroot.com

# AFTER (57 chars, fully visible)
Rafael Cavalcanti da Silva — Fullstack Developer & Security
```
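A tiny build-time guard keeps future titles under the truncation threshold; a sketch:

```js
// Flag any <title> that risks SERP truncation (~60 chars).
const MAX_TITLE = 60;

function checkTitle(title) {
  return { length: title.length, ok: title.length <= MAX_TITLE };
}

const before =
  'Rafael Cavalcanti da Silva | Fullstack Developer & Security Specialist — rafaelroot.com';
const after = 'Rafael Cavalcanti da Silva — Fullstack Developer & Security';

console.log(checkTitle(before)); // { length: 87, ok: false }
console.log(checkTitle(after).ok); // true
```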
📄 Fix 9 — Structured data: ProfilePage + WebSite ✅
Homepage upgraded from WebPage to ProfilePage (a subtype of WebPage). The WebSite schema gained alternateName and a trailing slash on url:
```jsonc
// WebSite — AFTER
{
  "@type": "WebSite",
  "name": "rafaelroot.com",
  "alternateName": ["Rafael Cavalcanti", "Rafael Cavalcanti da Silva", "rafaelroot"],
  "url": "https://rafaelroot.com/"
}

// Homepage — AFTER
{
  "@type": "ProfilePage",
  "mainEntity": { "@id": "https://rafaelroot.com/#person" },
  "dateCreated": "2026-02-01T00:00:00-03:00"
}
```
PageSpeed: before vs. after
| Metric (Mobile) | Before | After | Delta |
|---|---|---|---|
| Performance | 58 | 87 | +29 🚀 |
| Accessibility | 96 | 100 | +4 ✅ |
| Best Practices | 100 | 100 | — |
| SEO | 100 | 100 | — |
| FCP | 6.7s | 2.8s | -58% |
| LCP | 6.7s | 2.8s | -58% |
| Speed Index | 9.4s | 2.9s | -69% |
| TBT | 0ms | 0ms | — |
| CLS | 0 | 0.113* | *fixing with size-adjust |
Desktop stays at 90 / 96 / 100 / 100.
Going from 58 → 87 on mobile is the highest-impact SEO improvement possible: Google uses Mobile-First Indexing to determine ranking. Mobile performance = ranking.
What Google says NOT to worry about
Common myths debunked by Google's own guidelines:
| Myth | What Google says |
|---|---|
| Meta keywords | "Google Search doesn't use the keywords meta tag." |
| Keyword stuffing | Against spam policies. Write naturally. |
| Keywords in domain | "...have hardly any effect beyond appearing in breadcrumbs." |
| Word count | "The length of the content alone doesn't matter for ranking." |
| E-E-A-T as ranking factor | It's not. It's a quality evaluation framework. |
| Duplicate content penalty | "It's inefficient, but it's not something that will cause a manual action." |
Week 1 checklist
- [x] SERP baseline with screenshots and tables (03/08/2026)
- [x] Search Console configured + sitemap accepted (16 URLs)
- [x] robots.txt fixed (no longer blocks `/_astro/`)
- [x] `<meta name="keywords">` removed from both layouts
- [x] hreflang on articles via `translationKey` system
- [x] CSS/Fonts non-blocking (async preload + noscript fallback)
- [x] AA contrast: `--muted` → `#7090b0` (5.96:1)
- [x] Mobile: `content-visibility: auto` + reduced `backdrop-filter`
- [x] CLS: `@font-face` fallbacks with calibrated `size-adjust`
- [x] Titles shortened (<60 chars) + single `—` delimiter
- [x] ProfilePage schema on homepage + WebSite schema fixed
- [x] Real link to `/blog` in footer (sitelink-friendly)
- [x] PageSpeed Mobile: 58 → 87 | Desktop: 90/96/100/100
- [ ] Create GA4 property + set `PUBLIC_GA_ID`
- [ ] Deploy Nginx with optimized cache headers
- [ ] Re-run PageSpeed after deploy (confirm CLS < 0.1)
Next steps (Part 2)
- Measure real indexation: checkpoint 03/14
- GA4 with real ID + SEO dashboard (CTR, scroll depth, time on page)
- Link building: GitHub, Stack Overflow, dev.to
- First article targeting "rafael cavalcanti developer" with internal cross-links
I'm documenting everything in real time. Follow the series and let me know in the comments: which metric do you want to track in Part 2?
Last updated: March 8, 2026 | Full article with more technical details: rafaelroot.com/en/blog/seo-case-study-google-ranking-part-1/
Top comments (11)
I must say, I appreciate the way you have explained this. At Averybit Solutions, we have seen that the biggest mistake people make is combining everything under 'SEO Wins.' The only way to know what is working is to separate technical indexation and CTR performance.
Regarding the crawl budget you mentioned in the article, this is a 'silent killer' for sites that are growing. We recently worked on a project where Google was crawling thousands of pages but not indexing them unless we improved the authority signals for the domain. Creating those 'footholds' for the site on platforms such as Dev.to is how you solve this.
And good call on the mobile LCP optimization. Even with fast frameworks, paint time is a bottleneck for slower hardware. Can't wait to see the actual numbers in Part 2!
Your team's observation about crawl budget being the "silent killer" turned out to be one of the most important signals. In Part 2 (just published), I separated results into exactly that framework — technical indexation vs. content-driven ranking vs. CTR performance. Each moved for completely different reasons.
The "authority footholds" strategy (dev.to, GitHub, LinkedIn, npm) before scaling page count was crucial. Google indexed all 16 pages in 10 days, which I attribute directly to the domain authority signals from those platforms.
On mobile LCP — you were right to flag it. Even with Astro's zero-JS output, the font loading strategy was adding 920ms to LCP. Fixed it with a CSS media-swap pattern instead of JS injection. Now at Lighthouse 100 across all four categories.
Full details in Part 2 — including the Nginx production config with Brotli, GA4 setup, and Search Console analysis showing position 100+ → position 3 in 6 weeks.
Really interesting to see someone else tracking this journey. I'm running a similar experiment with two Astro sites: one dev blog, one niche tools site. Week 1.5 in and already seeing page-1 rankings for long-tail queries, which surprised me. The zero-JS output from Astro definitely helps with Core Web Vitals, which I think gives new sites a faster path into Google's index. Curious what your Lighthouse scores look like and whether you're seeing the same pattern.
Great to hear you're running a similar experiment! The zero-JS output from Astro is definitely a huge advantage — our Core Web Vitals are essentially perfect because there's nothing to block rendering.
As of this week, Lighthouse scores are 100/100/100/100 (Performance, Accessibility, Best Practices, SEO). The key fixes that got us from 99 to 100:
Performance: The LCP element was a text element using a Google Font loaded via async JS injection. Replaced it with a CSS media="print" onload="this.media='all'" pattern — eliminated 920ms from the render chain.
SEO: Added og:image:alt and twitter:image:alt meta tags.
For page-1 rankings on long-tail queries at week 1.5, that tracks with what I saw too. The fast indexation from static HTML + clean sitemaps gives new Astro sites a real head start. Would love to compare notes as your experiment progresses.
One thing I learned the hard way — images uploaded with zero XMP metadata are basically invisible to Google Image Search. Took me too long to figure this out. Fixed it with this → prometadata.com/inject
Thanks for sharing your experience with XMP metadata. For this particular site, all content is text-based and code-focused — there are no content images to optimize for Google Image Search. The OG images are generated graphics for social sharing, so image search isn't a ranking vector here. Interesting insight for image-heavy sites though.
Case-study format is great here. One thing that would make it even stronger is splitting results into indexation / ranking / CTR / on-page performance, because those move for different reasons and it helps avoid attributing everything to one change. For developers, the useful bit is seeing which improvements were content-driven versus technical.
Thank you so much for the feedback! Really glad you enjoyed the case-study format.
You make an excellent point about breaking down the results into indexation, ranking, CTR, and on-page performance separately. That's a much more surgical way to analyze what's actually moving the needle — and you're absolutely right that it helps avoid the trap of attributing everything to a single change. For Part 2, I'll definitely structure the results this way.
And I love the distinction you highlighted: for developers, the real value is understanding which improvements were content-driven versus technical. That's exactly the kind of clarity I want to bring. The technical side (fixing robots.txt, content-visibility: auto, hreflang implementation) is where we have direct control, but the content-driven results (titles, meta descriptions, actual page copy) are often where the ranking magic happens — and separating those signals is key.
Appreciate you taking the time to read and suggest improvements. Hope to deliver a solid Part 2 that lives up to this! 🚀
Really appreciate the detailed response, Rafael. The technical vs content-driven distinction is something I wish more SEO case studies separated clearly — knowing that fixing robots.txt moved indexation but it was the meta description rewrites that moved CTR tells you completely different things about where to invest next.
The crawl budget plateau you're mentally preparing for is real. On our site we're seeing it at a much larger scale — 51k pages Google has crawled but decided not to index. The relationship between domain authority and Google's willingness to spend crawl budget is the silent variable nobody talks about. You're smart to build those "authority footholds" early before the page count outpaces the trust signals.
Definitely looking forward to Part 2, especially the GSC dashboard correlating actions with results. That kind of before/after with real data is what makes technical SEO content actually useful versus the typical "follow these 10 steps" posts.
Really solid methodology here — love that you're documenting the baseline with actual screenshots and SERP positions instead of vague "we improved things" claims.
The robots.txt blocking `/_astro/` is a trap I've seen catch a lot of Astro sites. I run a multilingual Astro site with 100k+ programmatic pages across 12 languages, and the hreflang implementation you described with `translationKey` is almost exactly the pattern I use. The build-time cross-linking approach is the right call — doing it at runtime or via a plugin introduces too many failure modes.
One thing I'd flag for Part 2: with 16 URLs you'll likely get full indexation quickly. The challenge scales non-linearly once you cross ~1,000 pages. At that point, crawl budget becomes the real bottleneck — Google discovers your URLs but simply refuses to crawl them because the domain doesn't have enough authority signals yet. Your link building plan (GitHub, Stack Overflow, dev.to) is exactly the right starting point for that.
The `content-visibility: auto` fix for mobile is underrated. Most Astro SSG sites skip it because "there's no JS so it's already fast" — but paint time still matters for LCP on slower mobile hardware.
Looking forward to Part 2, especially the GA4 dashboard setup and real indexation numbers.
Thank you so much for the feedback and recognition! I'm really glad the methodology resonated with you — I always try to avoid vague "we improved things" claims and show the before/after concretely, with actual screenshots and real SERP data.
That robots.txt issue blocking the /_astro/ folder really is a classic trap. When I first started using Astro, it honestly didn't even cross my mind that this could be a problem, precisely because it's an SSG. I learned the hard way, watching important pages vanish from the index. And thank you for sharing your experience with hreflang on a 100k+ page site — knowing I'm on the right track with the build-time cross-linking approach (instead of runtime or plugins) gives me even more confidence. That's exactly why I chose it: fewer failure points.
Your prediction about crawl budget is spot on. I believe that with 16 URLs, full indexation really will come quickly. But I'm already mentally preparing for the "plateau" once the site scales. That relationship between domain authority and Google's willingness to spend crawl budget on new sites is a silent bottleneck that catches many people off guard. I'm glad the link building strategy (GitHub, Stack Overflow, dev.to) makes sense as a starting point — the idea is to create small "authority footholds" from the outside in.
And thanks for highlighting content-visibility: auto! Exactly. It's one of those "real-world performance" details that gets overlooked when we focus only on lab metrics. LCP on a modest mobile device still suffers, even with zero JavaScript, and this small CSS property makes a huge difference in initial paint.
Stay tuned because Part 2 is already in the works. Besides the real indexation numbers and GA4 setup, I'm putting together a dashboard that correlates Search Console data with the SEO actions implemented. And of course, some new tips should emerge about the next bottlenecks — including how to deal with that crawl budget "wall" you mentioned.
Once again, thanks for the comment and the exchange! It's conversations like these that make the community worth it. 🚀