The robots.txt is a small text file in your site's root directory that tells search engines how to crawl it, e.g. which parts of your site should or should not be indexed.
By default, every Box has a system-internal robots.txt file. The file differs, however, between Demo Boxes and activated (usually paid) Boxes.
Important: You will not find the standard robots.txt in your system's file system because it is generated dynamically.
robots.txt for Demo Boxes
By default, Demo Boxes always have these entries in robots.txt:
User-agent: *
Disallow: /
This completely excludes Demo Boxes from being indexed by search engines.
robots.txt for activated Boxes
As soon as you have activated one of your Boxes and connected it to a domain, the robots.txt will change to the following entries:
User-agent: *
Disallow:
These are the default entries for WordPress sites; the empty Disallow rule means nothing is blocked, so search engines can now crawl and index your site.
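The difference between the two defaults is easy to verify with Python's built-in urllib.robotparser; the snippet below is a small sketch (the URL is just a placeholder) showing that "Disallow: /" blocks everything while an empty "Disallow:" blocks nothing:

```python
from urllib.robotparser import RobotFileParser

def allowed(rules: str, url: str) -> bool:
    # Parse a robots.txt body and check whether user-agent "*" may fetch the URL
    rp = RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch("*", url)

demo = "User-agent: *\nDisallow: /"    # Demo Box default: block all crawlers
active = "User-agent: *\nDisallow:"    # activated Box default: allow everything

print(allowed(demo, "https://example.com/any-page"))    # False
print(allowed(active, "https://example.com/any-page"))  # True
```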
Custom robots.txt
As soon as you upload a custom robots.txt, it replaces the default file. To use your own robots.txt, place it directly in the wordpress folder of your file system.
To create a custom robots.txt, you can either use a plain text editor such as TextEdit (Mac) or Notepad (Windows), or use the corresponding function in your SEO plugin.
A custom robots.txt could look like this:
User-agent: *
Disallow: /yoursite
Disallow: /yoursite2
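Before uploading a custom file, you can sanity-check that its rules behave as intended, again with urllib.robotparser. This sketch assumes the example rules above; the paths and domain are placeholders:

```python
from urllib.robotparser import RobotFileParser

# The custom robots.txt example from above
custom_rules = """User-agent: *
Disallow: /yoursite
Disallow: /yoursite2"""

rp = RobotFileParser()
rp.parse(custom_rules.splitlines())

# Paths under the disallowed prefixes are blocked, everything else is allowed
print(rp.can_fetch("*", "https://example.com/yoursite/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/blog/"))               # True
```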