Robots.txt returns 404 instead of displaying text


Problem

When trying to request my robots.txt file at website.com/robots.txt, I always receive a 404 error.

Files

config > urls.py

from django.conf import settings
from django.contrib import admin
from django.urls import path, include, re_path
from django.views.generic import TemplateView
from django.conf.urls.static import static
from config import views
from django.conf.urls import handler404, handler500, handler403, handler400

handler404 = views.handler404
handler500 = views.handler500

urlpatterns = [
    path('admin/', admin.site.urls),
    path('accounts/', include('allauth.urls')),
    path('accounts/', include('django.contrib.auth.urls')),
    path('', include('pages.urls')),
    path('plants/', include('plants.urls')),
    path('robots.txt', TemplateView.as_view(template_name='robots.txt', content_type='text/plain')),
]

config > views.py

from django.http import JsonResponse, Http404, HttpResponse
from django.shortcuts import render

def handler404(request, exception=None):
    return render(request, '404.html', status=404)

def handler500(request):
    return render(request, '500.html', status=500)

templates > robots.txt

User-Agent: *
Disallow: /admin/
Disallow: /accounts/

Parts of my folder structure that may be helpful to know

project
|
|---config
|   |
|   |---urls.py
|   |---views.py
|
|---templates 
    |
    |---pages
    |   |---about.html
    |   |---contact.html
    |
    |---404.html
    |---robots.txt

I've also tried using just exception instead of exception=None inside my handler404.
I've tried moving the robots.txt view and URL pattern into my pages app, with robots.txt inside the pages template folder.
I've tried removing content_type='text/plain' from the URL pattern.
I've read through several tutorials and I'm not sure what I'm doing wrong.

CodePudding user response:

One or more of your included urls.py files contains a pattern that is too general and catches "robots.txt", so the URL dispatcher never reaches the expected pattern at the end of the pattern list.
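To see why ordering matters, here is a minimal plain-Python sketch of first-match dispatch (the names are illustrative, not real Django internals, but Django's resolver behaves the same way: it stops at the first pattern that matches):

```python
import re

# Sketch of first-match URL dispatch: the resolver walks the pattern
# list in order and returns the first view whose regex matches.
def resolve(path, patterns):
    for regex, view_name in patterns:
        if re.match(regex, path):
            return view_name  # first match wins; later patterns are never tried
    return None

# A too-general pattern included earlier in the list swallows the request:
patterns = [
    (r"^.*$", "pages_catch_all"),       # e.g. a broad pattern in pages/urls.py
    (r"^robots\.txt$", "robots_view"),  # never reached
]
print(resolve("robots.txt", patterns))  # pages_catch_all
```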

To fix the issue, move the robots.txt pattern higher in the pattern list, and/or review the included urls.py files and make their patterns more specific.
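Concretely, in the config/urls.py posted above, placing the robots.txt pattern before the '' include guarantees it is matched first (a sketch assuming the rest of the file stays as posted):

```python
urlpatterns = [
    path('admin/', admin.site.urls),
    path('accounts/', include('allauth.urls')),
    path('accounts/', include('django.contrib.auth.urls')),
    # robots.txt before the '' include, so a broad pattern inside
    # pages.urls can no longer shadow it
    path('robots.txt', TemplateView.as_view(template_name='robots.txt',
                                            content_type='text/plain')),
    path('', include('pages.urls')),
    path('plants/', include('plants.urls')),
]
```

It is also worth checking pages/urls.py for overly broad patterns such as re_path(r'^.*', ...) or a slug pattern at the root that matches "robots.txt".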

Also, please consider making the "static" robots.txt file truly STATIC in Django terms. Such files should not be served by the Django backend in production, and they do not need the template engine involved.
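If you do want Django to keep serving it (e.g. in development), a plain view avoids the template engine entirely. This is a sketch; the view name and module layout are illustrative:

```python
from django.http import HttpResponse
from django.urls import path

ROBOTS_TXT = """\
User-Agent: *
Disallow: /admin/
Disallow: /accounts/
"""

def robots_txt(request):
    # Serve the rules as plain text, no template lookup involved.
    return HttpResponse(ROBOTS_TXT, content_type="text/plain")

urlpatterns = [
    path("robots.txt", robots_txt),
    # ... other patterns ...
]
```

In production, serve robots.txt directly from the web server (e.g. an nginx or Apache location rule) so the request never reaches Django at all.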
