Robots.txt basics

A robots.txt file is a plain text file served at the root of a website that tells web crawlers and search engine robots which pages or sections of the site may be crawled and which should be excluded from indexing. It is a fundamental tool for controlling how search engines interact with your site's content. The steps below show how to generate one in a Nuxt.js project with the @nuxtjs/robots module.
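For reference, the generated file is plain text. A minimal example (the /admin path and the sitemap URL here are only placeholders) looks like this:

User-agent: *
Disallow: /admin
Allow: /
Sitemap: https://example.com/sitemap.xml
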
1. Add the @nuxtjs/robots dependency to your project
yarn add @nuxtjs/robots
# or with npm
npm install @nuxtjs/robots
  
2. Add @nuxtjs/robots to the modules section of nuxt.config.js
export default {
  modules: [
    // Simple usage
    '@nuxtjs/robots',

    // With options
    ['@nuxtjs/robots', { /* module options */ }]
  ]
}
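
If you prefer to keep everything in the modules array, the rule options shown in the next step can also be passed inline as module options; this is a sketch based on the module's documented usage (the sitemap URL is just an example):

export default {
  modules: [
    ['@nuxtjs/robots', {
      UserAgent: '*',
      Allow: '/',
      Sitemap: 'https://listofseo.com/sitemap.xml',
    }],
  ]
}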
  
3. Set your crawl rules and sitemap address in the robots section of nuxt.config.js
      
export default {
  robots: [
    {
      UserAgent: '*',
      Allow: '/',
      Sitemap: 'https://listofseo.com/sitemap.xml',
    },
  ]
}
  
4. Check that the sitemap address you gave is reachable, e.g. https://listofseo.com/sitemap.xml
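
With the configuration above, the module serves the rules at /robots.txt on your domain; the generated output should look roughly like this (assuming the example values from step 3):

User-agent: *
Allow: /
Sitemap: https://listofseo.com/sitemap.xml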