How to add a robots.txt file to a Django app?

Published: December 10, 2022

Tags: Django; Python;


To add a robots.txt file to a Django project, several approaches are possible. I chose to use a view, since that was the most convenient way to manage multiple subdomains:
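For reference, another common approach (not the one used here) is to serve a static template straight from the URLconf. A minimal sketch, assuming a templates/robots.txt file exists in your project:

from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    # render templates/robots.txt as plain text
    path('robots.txt',
         TemplateView.as_view(template_name='robots.txt',
                              content_type='text/plain')),
]

The view-based approach below keeps the content in Python, which makes it easy to vary per subdomain.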

Add a new URL

In the urls.py file of your app, add the line below. Note: since Django 2.0 the regex URL function is re_path (from django.urls); the old url() alias was removed in Django 4.0.

re_path(r'^robots\.txt$', views.robots_txt_view, name='robots_txt_view'),
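For context, a complete minimal urls.py could look like this (a sketch; the relative import assumes the URLconf lives in the same app as the view):

from django.urls import re_path

from . import views

urlpatterns = [
    re_path(r'^robots\.txt$', views.robots_txt_view, name='robots_txt_view'),
]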

Add a new view

Now, let's create a view:

from django.http import HttpResponse


def robots_txt_view(request):
    content = '''User-Agent: *
Allow: /
Sitemap: https://www.example.com/sitemap.xml
'''
    return HttpResponse(content, content_type='text/plain')

That's a minimal example.
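A quick way to verify the response is Django's test client (a sketch, assuming the URL pattern above is installed at the project root):

from django.test import Client

client = Client()
response = client.get('/robots.txt')
print(response.status_code)       # 200
print(response['Content-Type'])   # text/plain
print(response.content.decode())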

Note: to check whether a robots.txt file is valid, go to Google's robots.txt Tester and copy and paste the content value from above.

For instance:

User-Agent: *
Allow: /
Sitemap: https://www.example.com/sitemap.xml

is a valid robots.txt file (the tester reports 0 errors).

For more information, see Google's Create a robots.txt file or Robots FAQs.

Add a robots.txt file with subdomains

See the previous tutorial: How to test sub-domains locally with django (on Linux or Mac)?

Then edit the previous view so that the sitemap URL is built from the subdomain of the incoming request:

from django.http import HttpResponse


def robots_txt_view(request):
    # request.get_host() is the idiomatic replacement for parsing
    # request.build_absolute_uri() by hand; strip the port, if any,
    # then take the first label of the host name as the subdomain
    host = request.get_host().split(':')[0]
    subdomain = host.split('.')[0]
    content = '''User-Agent: *
Sitemap: https://{}.example.com/sitemap.xml
'''.format(subdomain)
    return HttpResponse(content, content_type='text/plain')
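To check the per-subdomain output, the test client can send a custom Host header (a sketch; 'blog.example.com' is a hypothetical subdomain and must be covered by ALLOWED_HOSTS, e.g. with '.example.com'):

from django.test import Client

client = Client()
response = client.get('/robots.txt', HTTP_HOST='blog.example.com')
print(response.content.decode())
# User-Agent: *
# Sitemap: https://blog.example.com/sitemap.xml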