
Robots.txt Generator

Generate robots.txt rules locally. Free, private, runs in your browser.

100% private — your files and text never leave your browser. All processing happens locally on your device.


Example output (robots.txt):

# Generated locally with Convertful
# Review before publishing.

User-agent: *
Allow: /
Disallow: /api/

Sitemap: https://example.com/sitemap.xml
Review rules carefully before publishing — robots.txt can block important pages from search engines.

You might also need

  • SERP Snippet Preview: Preview local search title and description snippets
  • Schema Markup Generator: Generate JSON-LD schema markup locally
  • Robots.txt Tester: Check whether Googlebot can crawl a URL, with rules grouped by User-Agent
  • URL Encode / Decode: Encode or decode URL strings

Robots.txt Is A Crawl Hint

Robots.txt tells cooperative crawlers what they may fetch. It is not authentication, privacy protection, or a guaranteed deindexing mechanism. If a page must stay private, protect it with access control instead of relying on robots.txt.
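Because robots.txt is purely advisory, only cooperative clients consult it before fetching. A minimal sketch with Python's standard-library `urllib.robotparser` illustrates both sides of that bargain (the `example.com` URLs and the `/private/` path are placeholders, not real rules):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical draft rules -- nothing here is fetched from a real site.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)

# A cooperative crawler asks before fetching and gets told "no"...
print(rp.can_fetch("*", "https://example.com/private/report"))  # False

# ...but nothing stops a plain HTTP client from requesting that URL
# anyway. Only real access control (authentication) keeps it private.
print(rp.can_fetch("*", "https://example.com/pricing"))  # True
```

The `False` result is only a promise that well-behaved bots will stay out; the page itself remains reachable by anyone.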

Review Before Publishing

A broad Disallow can accidentally block important pages, CSS, JavaScript, image paths, or your entire site. Generate the file locally, read each user-agent group, and test important URLs before publishing it at `/robots.txt`.
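One way to spot-check a draft before uploading it is to run your important URLs through Python's standard-library `urllib.robotparser`. The overly broad rule and site paths below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A draft with an accidentally broad rule: "Disallow: /" blocks everything.
draft = [
    "User-agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(draft)

# Pages, CSS, and JavaScript assets are all caught by the blanket rule.
for url in (
    "https://example.com/",
    "https://example.com/assets/site.css",
    "https://example.com/app.js",
):
    print(url, rp.can_fetch("Googlebot", url))  # False for every URL
```

Note that Python's parser applies the first matching rule in file order, while Googlebot uses the most specific (longest) match, so treat this as a local sanity check rather than an exact simulation of Googlebot.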

Sitemap And Crawl Delay

A Sitemap line helps crawlers discover your XML sitemap location. Crawl-delay is recognized by some crawlers but not by Googlebot, so treat it as a courtesy setting rather than a universal crawl-rate control.
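Both lines can also be read back programmatically. A small sketch with `urllib.robotparser` (Python 3.8+ for `site_maps()`; the sitemap URL is a placeholder):

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Crawl-delay: 10
Disallow: /api/

Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Sitemap lines are global; Crawl-delay belongs to a user-agent group.
print(rp.site_maps())       # ['https://example.com/sitemap.xml']
print(rp.crawl_delay("*"))  # 10
```

A non-None crawl delay only matters to the crawlers that honor it; Googlebot, as noted above, ignores the directive.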

FAQ

Does the robots.txt generator fetch my website?

No. It only turns the rules you enter into a robots.txt file. Convertful does not crawl your site, test URLs, or submit anything to search engines.

Can Convertful publish robots.txt for me?

No. Download the generated file, review it carefully, and publish it yourself at your site's `/robots.txt` path when you are ready.

Can robots.txt hide private content?

No. Robots.txt is crawler guidance, not access control. Sensitive pages should require authentication or be removed from public access.

Can a bad robots.txt hurt SEO?

Yes. A broad Disallow rule can block important pages, assets, or entire sections from crawling. Review each user-agent group before publishing the file.

Should I include a sitemap line?

Usually yes. A `Sitemap:` line helps crawlers discover your XML sitemap location, especially when it is not at the default `/sitemap.xml` path.