How to Optimize Your WordPress Robots.txt for SEO
Recently one of our readers asked us for tips on how to optimize the robots.txt file to improve SEO.
The robots.txt file tells search engines how to crawl your website, which makes it an incredibly powerful SEO tool.
In this article, we will show you how to create a perfect robots.txt file for SEO.
What Is a Robots.txt File?
Robots.txt is a text file that website owners can create to tell search engine bots how to crawl and index pages on their site.
It is typically stored in the root directory, also known as the main folder, of your website. The basic format for a robots.txt file looks like this:
User-agent: [user-agent name]
Disallow: [URL string not to be crawled]

User-agent: [user-agent name]
Allow: [URL string to be crawled]

Sitemap: [URL of your XML Sitemap]
Here is what a robots.txt example file can look like:
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap_index.xml
In the example above, we have allowed search engines to crawl and index files in the WordPress uploads folder. After that, we have disallowed search bots from crawling and indexing the plugins and WordPress admin folders.
Lastly, we have provided the URL of our XML sitemap.
Do You Need a Robots.txt File for Your WordPress Site?
If you don’t have a robots.txt file, then search engines will still crawl and index your website. However, you will not be able to tell search engines which pages or folders they should not crawl.
This will not have much of an impact when you’re first starting a blog and do not have a lot of content.
However, as your website grows and you add more content, you will likely want better control over how your website is crawled and indexed.
Here is why.
Search bots have a crawl quota for each website.
This means that they crawl a certain number of pages during a crawl session. If they don't finish crawling all the pages on your site, they will come back and resume crawling in the next session.
This can slow down your website indexing rate.
You can fix this by disallowing search bots from attempting to crawl unnecessary pages like your WordPress admin pages, plugin files, and themes folder.
By disallowing unnecessary pages, you save your crawl quota. This helps search engines crawl even more pages on your site and index them as quickly as possible.

Another good reason to use a robots.txt file is when you want to stop search engines from indexing a post or page on your website.
It is not the safest way to hide content from the general public, but it will help you prevent that content from appearing in search results.
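For example, a rule like the following (the /private-page/ path is purely hypothetical) asks all crawlers to stay out of one specific page:

User-agent: *
Disallow: /private-page/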
What Does an Ideal Robots.txt File Look Like?
Many popular blogs use a very simple robots.txt file. Their content may vary, depending on the needs of the specific site:
User-agent: *
Disallow:

Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml
This robots.txt file allows all bots to index all content and provides them with a link to the website's XML sitemaps.
For WordPress sites, we recommend the following rules in the robots.txt file:
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/

Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml
This tells search bots to index all WordPress images and files. It disallows search bots from indexing the WordPress admin area, the readme file, and cloaked affiliate links.
By adding sitemaps to the robots.txt file, you make it easy for Google's bots to find all the pages on your site.
Now that you know what an ideal robots.txt file looks like, let's take a look at how you can create a robots.txt file in WordPress.
How to Create a Robots.txt File in WordPress?
There are two ways to create a robots.txt file in WordPress. You can choose the method that works best for you.
Method 1: Editing Robots.txt File Using All in One SEO
All in One SEO, also known as AIOSEO, is the best WordPress SEO plugin on the market, used by over 2 million websites.
It’s easy to use and comes with a robots.txt file generator.
If you don't already have the AIOSEO plugin installed, you can see our step-by-step guide on how to install a WordPress plugin.
Note: A free version of AIOSEO is also available and includes this feature.
Once the plugin is installed and activated, you can use it to create and edit your robots.txt file directly from your WordPress admin area.
Simply go to All in One SEO » Tools to edit your robots.txt file.
First, you'll need to turn on the editing option by clicking the 'Enable Custom Robots.txt' toggle to blue.
With this toggle on, you can create a custom robots.txt file in WordPress.
All in One SEO will show your existing robots.txt file in the ‘Robots.txt Preview’ section at the bottom of your screen.
This version will show the default rules that were added by WordPress.
These default rules tell search engines not to crawl your core WordPress files, allow the bots to index all content, and provide them with a link to your site's XML sitemaps.
Now, you can add your own custom rules to improve your robots.txt for SEO.
To add a rule, enter a user agent in the ‘User Agent’ field. Using a * will apply the rule to all user agents.
Then, select whether you want to ‘Allow’ or ‘Disallow’ the search engines to crawl.
Next, enter a filename or directory path in the 'Directory Path' field.
The rule will automatically be applied to your robots.txt. To add another rule, click the 'Add Rule' button.
We recommend adding rules until you create the ideal robots.txt format we shared above.
Your custom rules will look like this.
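For reference, if you re-create the recommended rules from earlier, the 'Robots.txt Preview' section would show something like this (a sketch using the same example paths):

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/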
Once you’re done, don’t forget to click on the ‘Save Changes’ button to store your changes.
Method 2: Editing Robots.txt File Manually Using FTP
For this method, you will need to use an FTP client to edit the robots.txt file.
Simply connect to your WordPress hosting account using an FTP client.
Once inside, you will be able to see the robots.txt file in your website’s root folder.
If you don’t see one, then you likely don’t have a robots.txt file.
In that case, you can just go ahead and create one.
Robots.txt is a plain text file, which means you can download it to your computer and edit it using any plain text editor like Notepad or TextEdit.
After saving your changes, you can upload it back to your website’s root folder.
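If you are creating the file from scratch rather than editing an existing one, a minimal starting point could look like the sketch below (example.com is a placeholder; swap in your own sitemap URL), which you can then grow into the recommended rules shared earlier:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap_index.xml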
How to Test Your Robots.txt File?
Once you have created your robots.txt file, it’s always a good idea to test it using a robots.txt tester tool.
There are many robots.txt tester tools out there, but we recommend using the one inside Google Search Console.
First, you’ll need to have your website linked with Google Search Console. If you haven’t done this yet, see our guide on how to add your WordPress site to Google Search Console.
Then, you can use the Google Search Console Robots Testing Tool.
Simply select your property from the dropdown list.
The tool will automatically fetch your website's robots.txt file and highlight any errors and warnings it finds.
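If you'd rather spot-check rules locally, Python's built-in urllib.robotparser module can run a quick test against your live file. Here is a minimal sketch (example.com is a placeholder for your own domain, and the sample paths are just illustrations):

from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt file (example.com is a placeholder).
parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()

# Ask whether the rules let any bot ("*") crawl a few sample paths.
for path in ("/wp-admin/", "/wp-content/uploads/photo.jpg", "/sample-post/"):
    allowed = parser.can_fetch("*", "https://example.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")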
Final Thoughts
The goal of optimizing your robots.txt file is to prevent search engines from crawling pages that are not publicly available, such as pages in your wp-plugins folder or pages in your WordPress admin folder.
A common myth among SEO experts is that blocking WordPress category, tag, and archive pages will improve crawl rate and result in faster indexing and higher rankings.
This is not true. It’s also against Google’s webmaster guidelines.
We recommend that you follow the above robots.txt format to create a robots.txt file for your website.
Robots.txt for WordPress: A Simple Beginner's Guide
The robots.txt file plays an important role in the SEO of a WordPress blog. It is the robots.txt file that decides how search engine bots will crawl your blog or website.

I have already published a post on how to create a custom robots.txt file for Blogger. Today I am going to explain how a robots.txt file is created and updated for WordPress, because if you make even a small mistake while editing robots.txt, your blog may never be indexed in search engines. So let's find out what a robots.txt file is and how to create a perfect robots.txt file for WordPress.
What Is Robots.txt?
Whenever search engine bots visit your blog, they follow all the links on the page and crawl and index them. Only after this do your blog and its posts and pages show up in search engines like Google and Bing.
Besides posts, a blog or website has many other things, such as pages, categories, tags, and comments. But not all of these are useful to a search engine. Generally, a blog gets search engine traffic through its main URL (https://oyepandeyji.com), its posts, pages, and images; things like archives, pagination, and wp-admin are not important to a search engine. This is where robots.txt instructs search engine bots not to crawl such unnecessary pages.
If you have ever received an Index coverage email in your Gmail inbox, you will surely have seen a message like this:
New issue found: Submitted URL blocked by robots.txt
Here, robots.txt is not giving permission to crawl your URL, which is why that URL is blocked for search engines.
In other words, the robots.txt file decides which of your blog's web pages will show in Google or Bing and which will not. A single mistake in it can remove your entire blog from search engines, which is why new bloggers are afraid to create it themselves.
If you have not yet added a robots.txt file to your blog, first understand some of its basic rules and then create a perfect, SEO-optimized robots.txt file for your blog.
Create WordPress Robots.txt
The default WordPress robots.txt:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: [Blog URL]/sitemap.xml
As you can see above, robots.txt uses a few codes and bits of syntax. But most bloggers use this syntax on their blogs without understanding it. Once you understand what each directive means, you can write a proper robots.txt for your own blog yourself.
User-Agent: This is used to address instructions to specific search engine crawlers/bots.
User-agent: * means the rules apply to all search engine bots (e.g., googlebot, bingbot) that crawl your site.
User-agent: googlebot
Here, only Google's bot is being addressed by the rules.
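For example, a block like the following (the /example-folder/ path is hypothetical) applies only to Google's crawler and leaves every other bot unaffected:

User-agent: googlebot
Disallow: /example-folder/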
Allow: This directive gives search engine bots permission to crawl your web pages and folders.
Disallow: This directive stops bots from crawling and indexing a path. (Note that it does not actually stop people from accessing that path directly.)
1. If you want search engines to index every page and directory of your site, use the rule below. You may have seen this syntax in a Blogger robots.txt file.
User-agent: *
Disallow:
2. But this code will block every page and directory of your site from being indexed.
User-agent: *
Disallow: /
3. Use this code only if you use AdSense. It is for the AdSense robots that manage ads.
User-agent: Mediapartners-Google*
Allow: /
Example: suppose your robots.txt file looks like the one below; let's see what its rules mean.
User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /archives/
Disallow: /refer/
Sitemap: https://oyepandeyji.com/sitemap.xml
Whatever images and files you upload inside WordPress are saved in /wp-content/uploads/. So this code gives permission to index all images and files, and disallows search bots from crawling the WordPress plugin files, the WordPress admin area, archive pages, and affiliate links.
By adding the sitemap to the robots.txt file, search engine bots can easily find all the pages of your site.
You can create different kinds of robots.txt files depending on the needs of your blog or website. It is not necessary to use exactly the same robots.txt code that I use.
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /archives/
Disallow: /*?*
Disallow: /comments/feed/
Disallow: /refer/
Disallow: /index.php
Disallow: /wp-content/plugins/
User-agent: Mediapartners-Google*
Allow: /
User-agent: Googlebot-Image
Allow: /wp-content/uploads/
User-agent: Adsbot-Google
Allow: /
User-agent: Googlebot-Mobile
Allow: /
Sitemap: https://oyepandeyji.com/sitemap.xml
Note: Replace the Sitemap line with your own blog's sitemap (your site's URL instead of https://oyepandeyji.com).
How to Update the Robots.txt File in WordPress
Step 1: If your blog runs on WordPress, you most likely use the Yoast SEO plugin for SEO. First go to Yoast SEO and click the Tools button. Three tools will open in front of you; click File editor.
Step 2: After clicking File editor, a page will open. Under the Robots.txt section, click the Create robots.txt file button.
Step 3: In the box below Robots.txt, remove the default code, paste the robots.txt code given above, and click Save Changes to Robots.txt.
After updating the robots.txt file in WordPress, it is important to check for errors with the robots.txt tester tool in Search Console. The tool will automatically fetch your website's robots.txt file and show any errors and warnings it finds.
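As a quick extra check, you can also fetch the live file yourself to confirm your changes were saved. A minimal sketch in Python (example.com is a placeholder for your own domain):

import urllib.request

# Download and print the live robots.txt file (example.com is a placeholder).
with urllib.request.urlopen("https://example.com/robots.txt") as response:
    print(response.read().decode("utf-8"))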
The main goal of your robots.txt file is to stop search engines from crawling pages that do not need to be public. I hope this guide helps you create an SEO-optimized robots.txt of your own.
How to Add the Right Robots.txt File for SEO on a WordPress Site
Hello friends, in this post Bloglon is going to explain how to add the right robots.txt file for SEO on a WordPress site. In my previous post, I explained how to create a Microsoft account. For some days I had been noticing on social media that newbies were struggling with a big SEO problem on their WordPress blogs. Because of it, some people were ready to sell their blogs or felt forced to quit blogging. The usual cause is that they followed another newbie's blog post and created a robots.txt file based on it, or used a robots.txt generator tool and ended up with a wrong robots.txt file that hurts their blog's SEO. Some blogs have even stopped being indexed in search engines altogether.
What Is a Robots.txt File?
To explain: it is a file containing a few lines of code that search engines (Google, Yahoo, Bing, Yandex, Ask.com, etc.) follow. In other words, it is where you decide how you want your blog's content to be indexed in search engines.
Also read: What keyword stuffing is and how to protect your blog from it
Search engines such as Google follow whatever commands you insert in the robots.txt file. If you want, you can stop any post on your blog from being indexed in search engines. Let's try to understand, through an example, what the code inside this file looks like.
An example WordPress website robots.txt file:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap:
This is just the example file that comes pre-installed on every WordPress blog. By editing it, we can make our blog much more SEO friendly. Before editing it, let's also understand why editing it is necessary.

Why Is It Necessary to Edit the Robots.txt File?
If you use the default robots.txt file on your WordPress site or blog, search engines will index every type of content on your blog. As we all know, WordPress has many files that we should not get indexed, and some of those files can even weaken your blog's security.
If there is a page or post on your blog that you want to hide from search engines so that they do not index it, you have to edit this file and block that post from being indexed. That said, most bloggers use the Yoast SEO plugin for WordPress SEO, and you can also block a post or page from being indexed with it.
Its best and most common use is that you can easily block any directory, folder, or file on your WordPress blog that is of no use to your visitors, and that black-hat hackers could otherwise use while trying to hack your site.
Also read: How to find long-tail keywords: a keyword research guide
Further below, I will explain which directories and folders you should stop from being indexed in search engines for your WordPress blog.
If you use a custom robots.txt file correctly on your blog, your blog becomes much more SEO optimized, because search engines scan only the content you have allowed through this file. And when a user searches for content, the search engine can scan your blog faster, because you have disallowed the unnecessary things. This can noticeably improve your blog's traffic. That is its biggest benefit: your blog's SEO optimization becomes much stronger.
So now let's see how to edit this SEO-critical file correctly, that is, how to create a robots.txt file for a website.
How to Create the Right Robots.txt File on a WordPress Blog?
If you use the Yoast SEO plugin on your blog, creating this file becomes very simple. Carefully follow the steps below:
Step – 1
Log in to your WordPress Dashboard.
Step – 2
Go to SEO and click Tools.
Step – 3
Now click File editor.
Step – 4
1. In the 'Edit the content of your robots.txt' field, if your blog already has a default robots.txt, replace it with the code given below.
User-agent: *
Disallow: /wp-admin/
Disallow: /cgi-bin/
Disallow: /comments/feed/
Disallow: /trackback/
Disallow: /xmlrpc.php
Allow: /wp-admin/admin-ajax.php
User-agent: Mediapartners-Google*
Allow: /
User-agent: Googlebot-Image
Allow: /wp-content/uploads/
User-agent: Adsbot-Google
Allow: /
User-agent: Googlebot-Mobile
Allow: /
Sitemap: https://hindimeonline.com/post-sitemap.xml
Sitemap: https://hindimeonline.com/page-sitemap.xml
Sitemap: https://hindimeonline.com/category-sitemap.xml
Note – In place of the sitemap ({https://hindimeonline.com/sitemap.xml}), put your own blog's sitemap.
2. Now click Save changes to Robots.txt to save it.
Your robots.txt file has now been successfully added to your WordPress blog. But there is still plenty more to do, so keep following along.
Now you may be wondering what exactly I have allowed and disallowed in search engines for your blog. Such questions can come to any newbie's mind. There are currently many Hindi blogs with good followings, but having read several of their posts myself, I found they had shared wrong information, and followers who acted on it ended up burying their own blogs. Whether that is because they lack deep knowledge of the topic or for some other reason, only they (or God) can say. Bloglon never does that; here you will only get what the author knows in depth.
Also read: How to get 500 unique visitors on your blog daily
So let me explain technically what I have done in this robots.txt file and why; after that, I will show you how to submit the robots.txt file to Search Console.
A technical description of the robots.txt file for great SEO:
User-agent: * – The User-agent line tells search engines which crawler must follow the rules. We write the asterisk (*) because it addresses all search engines, such as Google, Yahoo, Bing, and Yandex.
Disallow: / – We write this to tell a search engine not to index a particular piece of content or folder. The forward slash (/) path lets the search engine know which particular folder or file it is being kept from indexing.
Allow: We write this so that the search engine understands what it should index.
What to Disallow in the Robots.txt File?
As I showed in the code above, I have disallowed some of the blog's internal files: /wp-admin/, /cgi-bin/, /comments/feed/, /trackback/, /xmlrpc.php. I have told search engines not to index any of these, because these files and folders are not useful to visitors, and disallowing them is a good SEO rule.
Now you may be thinking that every blogger says this. But if you look closely at the code above, you will notice quite a few differences from what many bloggers publish, places where they often make big mistakes. So next I am going to explain what you should not disallow in the robots.txt file.
What Not to Disallow in the Robots.txt File?
The files and folders we should not disallow are:
- /wp-includes/ – We should not block this directory for search engines by setting a Disallow rule in our WordPress blog's robots.txt. It used to be disallowed in the old days of SEO, but in modern SEO it is no longer blocked. The reason is that nearly every blog theme today uses asynchronous JavaScript, also called AJAX. If you block this directory, part of your site simply will not be crawled, which will have a negative impact on your blog's SEO; in other words, your blog's traffic will dry up.
- /wp-content/plugins/ – We should not block this directory either, because if you do, search engines will not be able to scan your JavaScript and CSS, and they will have trouble rendering your blog's content. In modern SEO, Google crawls your blog together with its JavaScript and CSS. If you block this directory, search engines will not be able to render your blog, and your traffic will suffer.
- /wp-admin/ – You cannot simply block this whole directory either, because it contains a file called admin-ajax.php; if that is blocked for search engines, they will again have problems indexing your blog. Now a question has surely come to your mind: the robots.txt code above does block /wp-admin/, and yet I am saying it should not be blocked. There is no need to worry. If you look at the code carefully, you will see that although I have blocked /wp-admin/, I have explicitly allowed /wp-admin/admin-ajax.php, so search engines will have no problem rendering the blog.
If you feel this information is incomplete, you can read Yoast's WordPress robots.txt post for great SEO, where you will find even more technical knowledge. Now let's move a little further ahead and see what you should allow in the robots.txt file for your WordPress blog.
What to Allow in the Robots.txt File?
Here is what I allowed for search engines in the code above:
Allow: /wp-admin/admin-ajax.php – I have already told you above that if you block this, your site will not be crawled properly in Google.
User-agent: Mediapartners-Google*
Allow: / – If you are an AdSense user, be sure to keep this allowed; otherwise AdSense ads will not be served on your site. So do not disallow it. If you do not use AdSense, you can disallow it. To learn more technical details, you can visit the robots.txt Google support page.
User-agent: Googlebot-Image
Allow: /wp-content/uploads/ – This directory should be allowed so that your blog's images can be rendered in search engines. If you disallow it, Google will not crawl your blog's images.
User-agent: Adsbot-Google
Allow: / – This is basically used for AdWords; it sets the rule that allows the advertiser's bot to render your pages.
User-agent: Googlebot-Mobile
Allow: / – This is used so that when a mobile user searches for content, your site can be crawled in its mobile version. If you have mobile visitors, this is very important for you, and we know that almost every site gets roughly 50% of its traffic from mobile. So allowing it is essential.
Sitemap: – Everyone knows what this is for, but if you don't: the sitemap lists all your blog's content, such as posts and pages. So we absolutely must keep it in our blog's robots.txt file.
Important Note – Remember, if you have more than one sitemap, you must add all of them to the robots.txt file. For example, the post sitemap may be separate from the page, category, and tag sitemaps; add each sitemap the same way I added the sitemaps in the code above.
Also read: How to create a sitemap for your blog or website?
Now that you know the technical details of the robots.txt file, let's see how to submit the robots.txt file to Search Console.
How to Submit the Robots.txt File to Google Search Console?
Step – 1
- First, log in to your Google Search Console dashboard.
- Now click the blog for which you want to submit the file.
Step – 2
- Click Crawl, which you will easily find in the left-side menu.
- Click Robots.txt Tester, because before submitting you must test whether the file is correct or has errors.
- Now paste the robots.txt code given above into the box.
- Now click the Test button.
If the test does not report any errors and everything checks out OK, we can now submit our blog's robots.txt file. To do that, follow the next step:
Step – 3
A new popup window will open in front of you; click the Submit button in it.
Congratulations! Your WordPress site's robots.txt file has now been successfully submitted to Google Search Console. Your blog has become much more SEO friendly, and search engines will now be able to crawl your content better and in less time.