Resolve Missing Robots.txt When Using Route Groups in Next.js App Router

January 6, 2024

Intro

If you've worked with Next.js, there's a good chance you've encountered a project using Route Groups. Route Groups are a feature of the Next.js App Router that let you logically group parts of your project within the /app folder, by wrapping a folder name in parentheses, without affecting URL routing.

If you're migrating a project from the Pages Router to the App Router and you currently serve robots.txt from the /public folder, you may find that robots.txt can no longer be loaded once it's placed inside a folder that's configured as a route group. The robots.txt file is important because it controls how search engines crawl and index your website.
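For context, the generated robots.txt for a site like this one might look like the following. The /studio path is an assumption here, standing in for whatever routes you want crawlers to skip:

```text
User-Agent: *
Allow: /
Disallow: /studio

Sitemap: https://www.dariusmcfarland.com/sitemap.xml
```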

My Problem

While I was configuring this blog, I wanted a way to have separate layouts for my main website and my Sanity Studio. Configuring a route group was perfect for this, so within my /app folder, I created two new folders: /(main-site) and /(sanity-studio).

.
└── app/
    ├── (main-site)/
    │   ├── robots.ts
    │   ├── sitemap.ts
    │   └── layout.tsx
    └── (sanity-studio)/
        └── layout.tsx

Since I knew that I wanted my studio route ignored by search engines, I figured placing my robots.ts file (which Next.js uses to generate robots.txt) at the base of my /(main-site) route group would make the most sense logically. However, this led to 404 errors when I tried to navigate to /robots.txt.

The Solution

To resolve this issue, the robots.ts file must be at the base of the /app folder, regardless of how many top-level route groups you have configured. In my testing, the sitemap.ts file can be placed either at the base of the /app folder or inside a top-level route group, and routing to sitemap.xml will work as expected.

.
└── app/
    ├── (main-site)/
    │   └── layout.tsx
    ├── (sanity-studio)/
    │   └── layout.tsx
    ├── robots.ts
    └── sitemap.ts
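With this layout, a minimal robots.ts might look like the sketch below. The /studio path and the sitemap URL are assumptions for illustration; in a real project you'd also type the return value as MetadataRoute.Robots from 'next':

```typescript
// app/robots.ts — a minimal sketch. Next.js serves the returned object
// as /robots.txt. The '/studio' path is a hypothetical example of a
// route you'd want crawlers to skip.
export default function robots() {
  return {
    rules: [
      { userAgent: '*', allow: '/', disallow: '/studio' },
    ],
    sitemap: 'https://www.dariusmcfarland.com/sitemap.xml',
  };
}
```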

In my project, I placed both my robots.ts and sitemap.ts files at the base of the /app folder since they're logically related. I'm also using the TypeScript versions of these files to gain type safety and to dynamically update my sitemap as new blog posts are added.
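A dynamic sitemap.ts can follow the same pattern. This is only a sketch: getAllPosts is a hypothetical helper standing in for a real CMS query (e.g. against Sanity), and the base URL and /posts path are assumptions:

```typescript
// app/sitemap.ts — a sketch of a dynamically generated sitemap.
// Next.js serves the returned array as /sitemap.xml.
type Post = { slug: string; updatedAt: string };

// Hypothetical stand-in for a real CMS query (e.g. a Sanity client fetch).
async function getAllPosts(): Promise<Post[]> {
  return [{ slug: 'resolve-missing-robots-txt', updatedAt: '2024-01-06' }];
}

const BASE_URL = 'https://www.dariusmcfarland.com';

export default async function sitemap() {
  const posts = await getAllPosts();
  return [
    // Static entry for the home page.
    { url: BASE_URL, lastModified: new Date() },
    // One entry per blog post, so new posts appear automatically.
    ...posts.map((post) => ({
      url: `${BASE_URL}/posts/${post.slug}`,
      lastModified: new Date(post.updatedAt),
    })),
  ];
}
```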

Conclusion

Whenever I run into a problem and find a solution, I like to understand why the solution works so I can see where my original approach went wrong. In this scenario, the solution I found doesn't really explain why my original approach didn't work.

With my original folder structure, I would've expected routing to robots.txt to work since /(main-site) was a top-level route group. Route groups should be ignored by the App Router when resolving URLs, so the path should've resolved to www.dariusmcfarland.com/robots.txt, but it didn't. The fact that routing to the sitemap worked in both cases adds to my confusion as well.

Maybe rewrites are the answer? If you know, please reach out to me on Twitter to clear things up. I hope this guide helped!