Deep SEO: Understanding and Optimizing the WordPress robots.txt File

Most people have a solid grasp of the basics of SEO. They understand keywords and know where keywords should appear throughout their content. They have heard of on-page SEO and may even have taken a spin through a WordPress SEO plugin. But if you look closely at the core of search engine optimization, you will find a few puzzle pieces that remain obscure to most. One of them is the robots.txt file.

What is a robots.txt file, and what is it for? robots.txt is a plain text file on your server. It contains rules for indexing your site and is a tool for communicating directly with search engines.
By default, the file tells search engines like Google where they may crawl and index your site, and which areas they should stay out of. But why keep Google away from your site at all? Isn't that harmful from an SEO perspective? In fact, there are several good reasons to instruct Google not to crawl parts of a site. One of the most common uses of robots.txt is to keep a website that is still in development out of the search results. The same goes for a staging copy of the site where you test changes before pushing them live. Or your server may hold files meant only for specific users that you simply do not want exposed on the open web.
Does robots.txt have to exist? Do you absolutely need one? No: a WordPress website gets indexed by search engines even without the file. In fact, WordPress already serves a virtual robots.txt on its own. That said, creating a physical copy on the server is recommended, because it makes things much easier to manage. One caveat: robots.txt cannot force compliance. The file is recognized and respected by the major search engines, but malicious crawlers and low-quality scrapers can ignore it entirely.
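For reference, the virtual file that a recent WordPress install serves at /robots.txt typically looks something like this; the exact output varies by WordPress version and active plugins, so treat this as an illustration rather than a guarantee:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```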
How and where do you create your own robots.txt? It is as easy as opening the editor of your choice, creating a text file, and naming it robots.txt. Save it, and step one is done. Step two: upload it via FTP. The file belongs in the root folder of your site, usually the same location as index.php, even if you have moved WordPress into its own directory. Wait for the upload to complete, and you are set. Note that each subdomain of your site, and each protocol such as HTTPS, requires its own separate robots.txt file.
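The two steps above can be sketched in a few lines of Python; the FTP host and credentials below are placeholders, not real values, and the upload call is commented out so the sketch runs locally:

```python
# Sketch: create robots.txt locally, then upload it to the site root.
# "example.com", "USER", and "PASSWORD" are placeholders -- use your own FTP details.
from ftplib import FTP

rules = "User-agent: *\nDisallow:\n"  # a permissive starter file

# Step one: write the file with any editor or script.
with open("robots.txt", "w") as fh:
    fh.write(rules)

# Step two: upload it to the site root via FTP (uncomment to actually run):
# with FTP("example.com", "USER", "PASSWORD") as ftp:
#     with open("robots.txt", "rb") as fh:
#         ftp.storbinary("STOR robots.txt", fh)

print(open("robots.txt").read())
```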
Now let's talk about the content. robots.txt uses a simple syntax for defining rules, also known as "directives". The two most important ones are:

User-agent – specifies which search engine crawler the following rules apply to.
Disallow – instructs the crawler to stay away from the specified files, pages, or directories.

If you do not need different rules for different crawlers, an asterisk (*) defines a common instruction for all of them. For example, to block every crawler from the entire website, configure robots.txt like this:

User-agent: *
Disallow: /
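As a quick sanity check, rules like these can be fed to Python's standard-library robots.txt parser; this is just a sketch for verifying directives, not part of any WordPress setup:

```python
from urllib import robotparser

# Feed the "block everything" rules to the stdlib parser
# and check what a crawler is allowed to fetch.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

print(rp.can_fetch("Googlebot", "https://example.com/"))          # False
print(rp.can_fetch("Googlebot", "https://example.com/any/page"))  # False
```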
A Disallow: / like this keeps crawlers out of every directory by default, so anything you do want shown needs further adjustment. For hiding low-quality content such as category, date, and other archives, Yoast advises against using robots.txt directives at all and strongly recommends noindex,follow meta tags instead. For the reasons mentioned above, Yoast's file also contains no reference to a sitemap. Matt Mullenweg, the creator of WordPress, uses a similarly minimal approach:

User-agent: *
Disallow:

User-agent: Mediapartners-Google*
Disallow:

User-agent: *
Disallow: /dropbox
Disallow: /contact
Disallow: /blog/wp-login.php
Disallow: /blog/wp-admin

You can see that he only blocks his personal dropbox and contact folders, plus the important login and admin paths of his WordPress install. Some people block the latter for security reasons, but Yoast actually recommends against hiding the wp-admin folder. The following example comes from WPBeginner:
User-agent: *
Allow: /?display=wide
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /refer/

Sitemap: http://www.wpbeginner.com/post-sitemap.xml
Sitemap: http://www.wpbeginner.com/page-sitemap.xml
Sitemap: http://www.wpbeginner.com/deals-sitemap.xml
Sitemap: http://www.wpbeginner.com/hosting-sitemap.xml

Besides the plugin folder and the readme.html file, you can see that affiliate links are blocked as well (the refer folder). Blocking readme.html avoids malicious queries that probe for a specific WordPress version; disallowing the file protects you from such large-scale automated attacks. Blocking the plugin folder likewise keeps hackers from hunting for vulnerable plugins, though here WPBeginner takes a different approach from Yoast, who leaves the plugin folder open so that styles and scripts inside it are not lost. WPBeginner also differs from the other two examples in explicitly setting /wp-content/uploads/ to Allow, making sure uploaded images get indexed.
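Rules of this shape, including the Allow lines, can be verified with Python's standard-library robots.txt parser as well; example.com stands in for the real domain in this sketch:

```python
from urllib import robotparser

# Validate WPBeginner-style rules: uploads allowed,
# plugins, readme.html, and affiliate links blocked.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Allow: /wp-content/uploads/",
    "Disallow: /wp-content/plugins/",
    "Disallow: /readme.html",
    "Disallow: /refer/",
])

print(rp.can_fetch("*", "https://example.com/wp-content/uploads/logo.png"))  # True
print(rp.can_fetch("*", "https://example.com/wp-content/plugins/x.php"))     # False
print(rp.can_fetch("*", "https://example.com/readme.html"))                  # False
```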
You can go further still. Because blocked pages can still surface in search results as ugly, stripped-down entries, the ideal robots.txt disallows nothing at all and simply links to an XML sitemap, provided a proper one is generated (admittedly, referencing the sitemap here is rare!). WordPress itself blocks only a few JS files by default and otherwise closely follows Google's recommendations.
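Put together, such a minimal "block nothing, point to the sitemap" file might look like this; example.com and the sitemap path are placeholders:

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```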
