Common SEO And Google Search Console Issues With Solutions

Vikesh Sharma
3 min read · Feb 29, 2024


Common SEO Issues in Websites

1. Pages couldn’t be crawled (DNS resolution issues)

Ans. Check and fix DNS server issues (clear the DNS cache, or try switching to a different set of DNS servers), and ensure the server responds quickly and reliably.
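As a quick diagnostic, a small Python sketch can confirm whether a hostname resolves at all (the domain below is a placeholder; substitute your own):

```python
# Quick DNS health check: report whether a hostname resolves.
import socket

def resolves(hostname: str) -> bool:
    """Return True if the hostname resolves to at least one address."""
    try:
        # getaddrinfo performs a DNS lookup without opening a connection
        return len(socket.getaddrinfo(hostname, 443)) > 0
    except socket.gaierror:
        return False

# "example.com" is a placeholder for your own domain
print(resolves("example.com"))
```

If this returns False intermittently, the DNS server itself is the likely culprit rather than the site.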

2. Pages with no viewport tag

Ans. Add a viewport meta tag ( <meta name="viewport" content="width=device-width, initial-scale=1.0"> ) to the head of each webpage to ensure proper scaling and rendering on all devices.

3. Pages with multiple canonical URLs

Ans. Ensure each page has only one canonical URL, giving search engines a single, unambiguous signal.
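To spot offending pages, a short Python sketch (standard library only; the sample HTML is illustrative) can count the canonical links a page declares:

```python
# Sketch: collect rel="canonical" links so pages declaring more than
# one can be flagged. Assumes the page HTML is already fetched.
from html.parser import HTMLParser

class CanonicalCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonicals.append(attrs.get("href"))

html = """<head>
<link rel="canonical" href="https://example.com/page">
<link rel="canonical" href="https://example.com/page?ref=dup">
</head>"""

parser = CanonicalCounter()
parser.feed(html)
print(parser.canonicals)  # two canonicals -> this page needs fixing
```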

4. Redirect chains and loops

Ans. Keep redirect paths as direct as possible (use a single 301 redirect straight to the final URL), and eliminate any redirect loops.
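Given a redirect map exported from your server config, chains and loops can be detected offline; here is a sketch (the URLs are illustrative placeholders):

```python
# Sketch: detect redirect chains and loops in a mapping of
# old URL -> new URL.
def follow(url, redirects, max_hops=10):
    """Return (final_url, hops), raising on a loop or overlong chain."""
    seen = [url]
    while url in redirects:
        url = redirects[url]
        if url in seen:
            raise ValueError("redirect loop: " + " -> ".join(seen + [url]))
        seen.append(url)
        if len(seen) > max_hops:
            raise ValueError("redirect chain too long")
    return url, len(seen) - 1

redirects = {
    "/old": "/interim",      # chain: /old -> /interim -> /new
    "/interim": "/new",
    "/a": "/b", "/b": "/a",  # loop
}
print(follow("/old", redirects))  # → ('/new', 2)
```

Any result with more than one hop means the source should point directly at the final URL.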

5. Issues with broken internal JavaScript and CSS files

Ans. Audit and fix any broken links to internal JavaScript and CSS files so pages load correctly. Check for typos, missing or incorrect directory paths, outdated URLs, and syntax errors such as missing semicolons, parentheses, or curly braces, and correct them manually. After making changes to your JavaScript and CSS files, clear your browser’s cache and reload the webpage to verify the fix.

6. Pages with low text-HTML ratio

Ans. Increase the amount of textual content or reduce unnecessary HTML code to improve the text-HTML ratio.
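The ratio can be estimated with a few lines of Python (a rough heuristic; commonly cited thresholds are around 10%, though that figure is a rule of thumb, not a Google requirement):

```python
# Sketch: estimate a page's text-to-HTML ratio from its raw markup.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.text = []

    def handle_data(self, data):
        # collect visible text nodes
        self.text.append(data)

def text_html_ratio(html: str) -> float:
    extractor = TextExtractor()
    extractor.feed(html)
    visible = "".join(extractor.text).strip()
    return len(visible) / len(html) if html else 0.0

sample = "<html><body><p>Hello world</p></body></html>"
print(round(text_html_ratio(sample), 2))  # → 0.25
```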

7. Subdomains don’t support SNI

Ans. Ensure that all subdomains are configured to support SNI for better security and compatibility.

8. Issues with uncompressed JavaScript and CSS files

Ans. Compress JavaScript and CSS files (minify the code and enable server-side compression) to reduce their size, improving load times and efficiency.
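The effect of server-side compression can be demonstrated in Python; this sketch gzips a CSS payload the way a server would before sending Content-Encoding: gzip (minification, i.e. stripping whitespace and comments, is a separate, complementary step):

```python
# Sketch: gzip-compress a CSS payload to show the size reduction.
import gzip

css = b"body { margin: 0; padding: 0; }\n" * 100  # repetitive, compresses well
compressed = gzip.compress(css)
print(len(css), len(compressed))
```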

9. Pages use too many JavaScript and CSS files

Ans. Consolidate and minimize the use of JavaScript and CSS files to reduce load times and requests.
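Consolidation can be as simple as concatenating assets into one bundle; a minimal sketch (file names are illustrative, and real projects typically use a build tool for this):

```python
# Sketch: bundle several CSS files into one to cut HTTP request count.
from pathlib import Path
import tempfile

def bundle(paths, out_path):
    """Concatenate the given files into a single output file."""
    combined = "\n".join(Path(p).read_text() for p in paths)
    Path(out_path).write_text(combined)
    return out_path

# illustrative files in a temporary directory
tmp = Path(tempfile.mkdtemp())
(tmp / "reset.css").write_text("body { margin: 0; }")
(tmp / "layout.css").write_text(".grid { display: grid; }")
bundle([tmp / "reset.css", tmp / "layout.css"], tmp / "bundle.css")
print((tmp / "bundle.css").read_text())
```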

10. Issues with uncached JavaScript and CSS files

Ans. Implement caching strategies for JavaScript and CSS files to improve load times for repeat visitors.
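For example, with nginx this might look like the fragment below (an illustrative config, assuming static assets are served directly; the one-year lifetime presumes you use versioned filenames like app.3f9a1c.css so updates bust the cache):

```nginx
# Illustrative: serve static JS/CSS with long-lived browser caching.
location ~* \.(js|css)$ {
    expires 1y;
    add_header Cache-Control "public, immutable";
}
```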

Google Search Console Issues

1. Submitted URL marked “noindex”

Ans. First, check in Google Search Console (GSC) how many pages have been indexed, and check whether the pages marked “noindex” are included in the sitemap. Then go to the “Excluded” section in GSC and examine the pages excluded by “noindex”. For the pages you do want indexed, check whether the <meta name="robots" content="index, follow"> tag is present in the head section; if the tag is set to “noindex”, change it to “index, follow” and resubmit the URLs in GSC. For the pages you do not want indexed, remove them from the sitemap.
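Checking the robots meta tag can be automated; here is a standard-library Python sketch (the sample HTML is illustrative) that flags pages whose head contains a noindex directive:

```python
# Sketch: detect a robots "noindex" meta tag in a page's HTML.
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            if "noindex" in attrs.get("content", "").lower():
                self.noindex = True

def has_noindex(html: str) -> bool:
    finder = RobotsMetaFinder()
    finder.feed(html)
    return finder.noindex

page = '<head><meta name="robots" content="noindex, follow"></head>'
print(has_noindex(page))  # → True
```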

2. Server error (5xx)

Ans. The “Server error (5xx)” issue appears in the Google Search Console’s coverage report. To fix this, you need to copy all the URLs of the affected pages. Then, go to the “Removals” option in the Google Search Console and select “Temporary Removals.” In the “New Request” section, paste all the URLs and click on “Next.” All the URLs will be added to the list, and after a few days, they will be removed.

3. Submitted URL seems to be a Soft 404

Ans. Common causes:
1. The page has no content.
2. The page has only one or two lines of content.
3. The page contains lorem-ipsum placeholder content.
4. The content on the page is not visible (a rendering issue).

All these issues may lead to a soft 404 error. To resolve them, add quality content, check for rendering issues, ensure there is substantial content on the page, and replace dummy content if present. After addressing these issues, resubmit all the URLs on Google Search Console.
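A rough thin-content check can help triage these pages before resubmitting; in this sketch the 50-word threshold is my own assumption, not a Google-defined limit:

```python
# Heuristic sketch: flag pages whose visible text is too short to be
# worth indexing (a common cause of soft-404 classification).
from html.parser import HTMLParser

class VisibleWords(HTMLParser):
    def __init__(self):
        super().__init__()
        self.words = []

    def handle_data(self, data):
        self.words.extend(data.split())

def looks_thin(html: str, min_words: int = 50) -> bool:
    # min_words=50 is an assumed threshold, tune to your content
    v = VisibleWords()
    v.feed(html)
    return len(v.words) < min_words

print(looks_thin("<p>Lorem ipsum</p>"))  # → True
```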

4. Indexed, though blocked by robots.txt

Ans. To get specific pages indexed, permit them in the robots.txt file using the “Allow” directive. If you do not want a page crawled, remove its “Allow” directive from the robots.txt file instead.
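Before resubmitting, you can verify what your robots.txt actually permits with Python’s urllib.robotparser. The rules below are illustrative; note that this parser applies the first matching rule, so place specific Allow lines before broader Disallow lines:

```python
# Sketch: check which URLs a robots.txt permits crawling.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /private/landing-page.html
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# the landing page is explicitly allowed; everything else under
# /private/ stays blocked
print(rp.can_fetch("Googlebot", "https://example.com/private/landing-page.html"))  # → True
print(rp.can_fetch("Googlebot", "https://example.com/private/secret.html"))        # → False
```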

5. Discovered — currently not indexed

Ans. Search engines work in four phases: a page is first discovered, then crawled, then indexed, and finally ranked. To solve this issue, improve the quality of the content on your page, and don’t rely on AI-generated content. Poor technical SEO on those pages can also be a cause, so check it. Your server may also be unable to handle the crawl load.

6. Crawled — currently not indexed

Ans. Interlink the relevant pages with relevant anchor text. Check whether the pages are included in the sitemap. Verify that no server errors occur on those pages; if they do, check your hosting to ensure the server is not down. Also confirm that the pages are not set to “noindex.”
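The sitemap check can be scripted; here is a sketch with an inline example sitemap (normally you would fetch your real sitemap URL and your actual page list):

```python
# Sketch: check whether specific URLs are listed in a sitemap.xml.
import xml.etree.ElementTree as ET

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/post-1</loc></url>
</urlset>"""

# the sitemap schema lives in its own XML namespace
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap)
listed = {loc.text for loc in root.findall(".//sm:loc", ns)}

print("https://example.com/blog/post-1" in listed)  # → True
```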


Vikesh Sharma

I have 7+ years of experience in SEO, SMO, and digital marketing, with certifications in digital marketing. I work on WordPress, PHP, HTML, and similar websites.