Unlock The Secrets of English Google SEO in December 2023
Table of Contents
- Introduction
- Meta Tags for Iframes
- Choosing a Category Structure
- Disallowing Crawling of Script Tags
- Indexing Issues after Moving to HTTPS
- English vs Chinese Characters in URLs
- Handling Meta Name Prerender Status Code
- Optimizing Page Design Changes
- Pros and Cons of Double Slashes in URLs
- Fixing "Video Outside the Viewport" Issue
- Transferring Image Rank to New URLs
- Dealing with 404 Errors
- Soft 404 Errors and Cloaking
- Recrawling Websites for Updated Information
- Indexing Sitemap Files
- Crawling of API Paths in JSON
- Improving Visibility of a Service-Based Business
- Redirecting PDF Downloads to Website
- Impact of Accessibility on Ranking
- Paid Links and Violations
- Taking Action on Reported Paid Backlinks
- Impressions Decreasing despite Content Updates
- Removing Content from Google Index
- Inclusion of Company-Owned Blogs in Google News
- Handling Special Characters in Search Results
- Discrepancy between Google Search Console and SERP
- Negotiating SEO with Google for Free
- Removing Old Website from Search Results
- Trust between GA4 and Search Console
- Including Periods in URLs as Part of SKU
- Measuring SEO Perfection
Introduction
In this article, we will delve into various aspects of SEO based on questions and answers from Google's SEO Office Hours. We will cover topics such as meta tags for iframes, choosing a category structure for websites, handling crawling of script tags, indexing issues after moving to HTTPS, and much more. So, let's get started and explore the world of SEO!
Meta Tags for Iframes
Question: Are there any meta tags required to ensure that the contents of an iframe are associated with the page using the iframe and not the original page?
Answer: When a primary page embeds a subpage using an iframe element, search engines generally try to associate the subpage's content with the primary page for indexing. However, this is not guaranteed, since both are separate HTML pages. To ensure that the subpage is only indexed as part of the primary page, you can use a combination of the noindex and indexifembedded robots meta tags on the subpage. Conversely, if you want to prevent the subpage from appearing as part of the primary page at all, you can send the appropriate X-Frame-Options HTTP response header to prevent embedding of the subpage in iframe elements.
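The two options above can be sketched as follows. This is an illustrative fragment, not a complete page; the SAMEORIGIN value shown for X-Frame-Options is one possible choice among several.

```html
<!-- On the subpage: block standalone indexing, but allow the content
     to be indexed when it is embedded via an iframe. -->
<meta name="robots" content="noindex, indexifembedded">

<!-- Alternatively, to keep the subpage out of the primary page entirely,
     the server can refuse cross-origin embedding with a response header:

     X-Frame-Options: SAMEORIGIN
-->
```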
Choosing a Category Structure
Question: Which category structure is better for my website: hierarchical or flat structure?
Answer: The choice between a hierarchical or flat category structure largely depends on the size of your website. For large websites, it is typically better to have a hierarchical structure that allows for easier navigation within different sections. This also enables search engines to treat different sections differently, especially when it comes to crawling. For example, having a separate news section for news content and archives can help search engines crawl news faster than the other sections. On the other hand, if you put everything in one directory, it may not be as efficient for crawling and organization.
Disallowing Crawling of Script Tags
Question: How can I tell Googlebot not to search for links in selected script tags for JSON or JavaScript?
Answer: If you want to prevent Googlebot from crawling certain URLs or content referenced within script tags, you can explicitly disallow crawling of those URLs in your robots.txt file. By doing so, Googlebot will not make a request to these URLs and will not consider any links or content within them for crawling or indexing. This helps ensure that the specific URLs are excluded from search results.
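A minimal robots.txt sketch of this approach. The paths below are hypothetical placeholders for whatever URLs your script tags reference, not values from the source.

```txt
# Block Googlebot from fetching API endpoints and data files
# that are referenced inside script tags.
User-agent: Googlebot
Disallow: /api/
Disallow: /data/products.json
```

Note that Disallow prevents crawling, not indexing: a URL that is blocked here can still appear in results if it is linked prominently elsewhere, just without its content.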
Indexing Issues after Moving to HTTPS
Question: I built a new site on Core MVC and moved it to HTTPS, but I have problems with indexing the new pages.
Answer: After migrating your site to HTTPS, if you are experiencing indexing issues, it's important to check how your site is currently indexed. Sometimes the indexed version of your site may differ, for example if the www subdomain is not the version that was indexed. Make sure to search for your domain alone (e.g., site:domain.com) to check whether it is indexed. If you are still facing issues, it is recommended to seek assistance to ensure proper indexing of your new pages.
English vs Chinese Characters in URLs
Question: For non-English page URLs, would it be better to have English language in the slug or use Chinese characters?
Answer: The choice between English or Chinese characters in non-English page URLs may not have a significant impact on SEO. Both options are generally acceptable. However, using the language of the content in the URLs can sometimes be helpful for search and for users who may prefer URLs in their native language. Ultimately, the decision should be based on your target audience and the overall user experience.
Handling Meta Name Prerender Status Code
Question: What does Googlebot do when it finds a meta name prerender-status-code content 404 tag?
Answer: Currently, Googlebot ignores the status code specified in the meta name prerender-status-code tag. This is typically encountered in single-page applications that are client-side rendered. To avoid soft 404 errors, consider adding a meta tag for robots noindex or redirecting to a page that returns a 404 status code. For more information on handling such scenarios, refer to Google's documentation at developers.google.com/search.
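The noindex approach can be sketched for a client-rendered app as below. The isNotFound() check is a hypothetical stand-in for whatever your router uses to detect a missing route; only the meta-tag injection is the technique being illustrated.

```html
<script>
// When the client-side router resolves to a "not found" route,
// inject a robots noindex tag so Google treats the page as
// unindexable, instead of relying on prerender-status-code.
if (isNotFound(window.location.pathname)) {
  const meta = document.createElement('meta');
  meta.name = 'robots';
  meta.content = 'noindex';
  document.head.appendChild(meta);
}
</script>
```

This only works if the tag is added before Google renders the page, so it should run as early as possible in the routing lifecycle.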
Optimizing Page Design Changes
Question: I'm preparing to launch a new website design with UI and UX improvements and new pages. Is it better to change the page designs one at a time?
Answer: When it comes to making changes to your website design, it is crucial to be clear about what exactly is changing. Ideally, you should map out all the changes in a document and identify which ones might have SEO implications. If you are changing URLs, refer to Google's guidance on handling site migrations. Additionally, if you are uncertain about the potential SEO effects of your design changes, it is recommended to seek help from experienced professionals. Proper preparation and execution are key to avoiding SEO issues during a website redesign or migration.
Pros and Cons of Double Slashes in URLs
Question: What is the SEO impact of using double slashes in a URL, such as in https://example.us//us/shop?
Answer: From an SEO perspective, using double slashes in a URL path, as in the example provided, is not technically an issue. Under RFC 3986, the forward slash is a separator and can appear multiple times in a URL path. From a usability standpoint, however, it is not an optimal practice, as it can confuse users and some crawlers. It is generally recommended to use single forward slashes in URLs for better readability and user experience.
Fixing "Video Outside the Viewport" Issue
Question: How can I fix the "Video outside the viewport" issue in Google Search Console?
Answer: If you encounter the "Video outside the viewport" issue in Google Search Console, the best way to address it is by positioning the video element at the top of the page. By placing the video at the top, users will instantly see it without having to scroll down, which resolves the error. This ensures a better user experience and increases the likelihood of the video being properly indexed and displayed in search results.
Transferring Image Rank to New URLs
Question: Product image URLs are changed and now hosted on another server. How can I tell Google to transfer the current image rank to a new URL?
Answer: To transfer the image rank from old URLs to new URLs, it is important to update the image elements in your website's code to point to the new image URLs. By doing so, you are indicating to Google that the images have moved. Additionally, it is recommended to set up redirects from the old image URLs to the corresponding new URLs. Keep in mind that the transfer of image rankings may take some time as search engines reprocess and update their systems.
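The redirect part can be sketched as a server rule on the old host. This is a minimal nginx example; the /images/ path and cdn.example.com hostname are placeholders, not values from the source.

```nginx
# On the old server: permanently redirect every old image URL
# to the same path on the new image host.
location /images/ {
    return 301 https://cdn.example.com$request_uri;
}
```

A 301 (permanent) redirect is the signal search engines use to consolidate signals onto the new URL, so avoid 302s for a permanent move like this.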
Dealing with 404 Errors
Question: My site had a lot of 404 pages which I requested to be removed after a few published articles. Should I worry about the 404 errors?
Answer: 404 errors are a normal part of the web and are nothing to be afraid of. In fact, they are expected when outdated or non-existent URLs are accessed. However, it is essential to keep an eye on important pages that unexpectedly return a 404 status. If a page that you consider important is suddenly returning a 404, it is recommended to fix those specific URLs. Otherwise, the presence of 404 errors on your site shouldn't be a cause for concern.
Soft 404 Errors and Cloaking
Question: The site returns HTTP 200 for 404 pages. Is it considered a soft 404 or cloaking? How bad is this?
Answer: When a site returns an HTTP 200 status code for URLs that should ideally return a 404 status, it is generally considered a soft 404 error. Soft 404 errors are not considered cloaking, which is intentionally providing different content to users and search engines. While soft 404 errors are undesirable, they are not severe issues that would significantly impact your site's rankings. To avoid soft 404 errors, it is recommended to configure your server to always respond with the appropriate 404 status code or use JavaScript to add a meta robots tag with a value of noindex for those specific pages. For more details and best practices, refer to the JavaScript documentation on developers.google.com/search.
Recrawling Websites for Updated Information
Question: How can I recrawl my website so that my students would find the new information?
Answer: To ensure that your students discover the new information on your website, it is important to make it clear to search engines what changes have occurred. If you update a page, prominently mention the update on your website to bring it to the attention of both search engines and users. If you move content from one page to another, make sure to set up proper redirects from the old content to the new one. By doing so, even if search engines take some time to fully understand the updates, your users will be able to access the new content seamlessly.
Indexing Sitemap Files
Question: Can the sitemap file link itself or the sitemap page itself be indexed?
Answer: Technically, the sitemap file or the sitemap page can be indexed. However, it is generally not necessary or beneficial to force the indexing of the sitemap. Indexing the sitemap does not directly impact your site's rankings or visibility. If you want to prevent the sitemap from being indexed or remove it from search results, you can add the noindex robots tag to the sitemap or configure the corresponding HTTP header to send the noindex directive. However, it is important to note that indexing or not indexing the sitemap doesn't have a significant impact on your site's overall SEO.
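Because a sitemap is XML rather than HTML, the noindex directive has to be sent as an HTTP header rather than a meta tag. A sketch for Apache (.htaccess), assuming mod_headers is enabled and the file is named sitemap.xml:

```apache
# Send a noindex directive for the sitemap file itself,
# without affecting how the URLs listed inside it are crawled.
<Files "sitemap.xml">
    Header set X-Robots-Tag "noindex"
</Files>
```

The X-Robots-Tag header only keeps the sitemap file out of search results; search engines still read it normally for URL discovery.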
Crawling of API Paths in JSON
Question: We've recently been seeing a large increase in 404 errors based on Google picking up API paths in our raw JSON and crawling them. Is this something we should be worried about?
Answer: If you notice Googlebot crawling API paths in your raw JSON and encountering 404 errors, there is no need to be overly concerned. Googlebot often tries to check if these URLs contain content that could be useful for indexing and displaying to users. However, since these API paths are not meant to be accessed directly, they return a 404 status. This is normal, and Google understands that not all URLs are intended to be accessed by users. If the 404 errors are limited to these API paths, there should be no negative impact on your site's SEO.
Improving Visibility of a Service-Based Business
Question: Why doesn't my service-based business show up in search results for things like snow removal or de-ice in my town, which is one of my listed service areas?
Answer: There could be multiple factors contributing to the lack of visibility for your service-based business in search results. One possibility is that the name of your business, "Whiteout," which is a common word, makes it challenging for search engines to identify and differentiate your business from others. To improve visibility, ensure that you have a Google Business Profile set up for your business and that it is linked to your various business listings. Additionally, make sure to mention your website prominently on your business profiles for better discoverability.
Redirecting PDF Downloads to Website
Question: If a PDF ranks in the SERPs, can the user download the PDF and then be redirected to the site where the PDF is found?
Answer: Redirecting users directly from a downloaded PDF to the website where the PDF is found is not possible. However, you can add a link within the PDF itself, encouraging users to visit your website for more information or related content. By providing an explicit call-to-action and linking to your site, you can guide users to visit your website after viewing the PDF.
Impact of Accessibility on Ranking
Question: Is accessibility important for ranking, specifically considering PageSpeed Insights interpretation?
Answer: Accessibility plays a vital role in ensuring a helpful and user-friendly website, although it may not have a direct impact on ranking. Some accessibility features, such as providing descriptive image alt tags, can also provide valuable information to search engines like Googlebot. However, it is important to remember that the primary goal should be to build a website that is accessible and useful for all users, rather than solely focusing on ranking factors. A website designed with accessibility in mind can result in better user experiences and broader reach.
Paid Links and Violations
Question: Buying links from third-party websites is listed as a violation in Google's webmaster guidelines. On what basis is that?
Answer: Paid links violate Google's webmaster guidelines because they aim to manipulate search rankings artificially. Google values organic, natural links that are earned based on the quality and relevance of the content. Paid links, on the other hand, attempt to bypass this process and can mislead search engines and users. Google considers paid links a form of link spam, and attempts to manipulate search rankings through such practices can lead to penalties and a negative impact on your site's visibility.
Taking Action on Reported Paid Backlinks
Question: I have reported thousands of paid backlinks with no noticeable enforcement by Google. How can I help Google act on these reports?
Answer: Reporting paid backlinks to Google is valuable, and the information provided helps improve Google's algorithms. However, it's important to note that individual reports may not result in immediate actions or penalties. Google utilizes these reports to gain insight into trends and patterns to improve its systems as a whole. It is a continuous process rather than a case-by-case basis. If you have additional questions or concerns about specific cases, it is advisable to engage with the Help community, where you can discuss the issues and seek advice from experts.
Impressions Decreasing despite Content Updates
Question: My website's impressions are continuously decreasing even though I've continuously updated new products according to trends. There are no warnings or penalties for my website. What could be the reason?
Answer: Building and maintaining a successful online presence goes beyond just adding new pages or products to your website. While updating your content regularly is important, it is equally crucial to focus on aspects such as providing unique value, quality, and improving the user experience. Search engines and user behavior are constantly evolving, so it is essential to adapt and refine your SEO strategies over time. Sometimes, even with everything done correctly, fluctuations in impressions can occur due to factors beyond your control. There is no easy secret to online success, but persistence, continuous improvement, and staying informed about industry trends can contribute to long-term growth.
Removing Content from Google Index
Question: How do I remove content from the Google index on my website?
Answer: The most straightforward way to remove content from the Google index is by deleting it from your website. Once the content is removed, Google will eventually recrawl and reprocess your site, and the previously indexed URLs and associated content will be updated accordingly. However, keep in mind that it may take some time for search engines to revisit your site and update their indexes. If you need immediate removal, you can also use the Removals tool in Google Search Console or utilize the noindex directive to prevent certain pages from being indexed. For more detailed instructions and guidelines, refer to Google's documentation on content removal.
Inclusion of Company-Owned Blogs in Google News
Question: Are company-owned blogs eligible to be included in the Google News feed?
Answer: Although I primarily focus on SEO, I can provide some general information. Company-owned blogs, like any other source of news and information, may be eligible for inclusion in the Google News feed under certain conditions. While I don't have access to Google News policies, I recommend reviewing their content policies to see if your company blog meets the requirements. If you are already publishing news content and want to verify if your pages are being shown in Google News, you can review the performance reports in Google Search Console to gain insights into your site's visibility.
Handling Special Characters in Search Results
Question: How does Google handle special characters like the superscript E in search results?
Answer: The display of special characters in search results depends on how they are encoded in the HTML of your web pages. Specify the character encoding in your HTML using the proper meta elements and charset attributes, especially when using characters beyond the standard ASCII set. By explicitly defining the encoding, you can ensure that special characters, such as the superscript E, are rendered correctly in search results. Taking care of proper character encoding is crucial in maintaining the intended appearance and readability of your content.
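A minimal sketch of an explicit encoding declaration. The page content is an invented example chosen to include a character outside ASCII (the superscript e, U+1D49).

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <!-- Declare UTF-8 so characters beyond ASCII render correctly. -->
    <meta charset="utf-8">
    <title>Guide, 2ᵉ edition</title>
  </head>
  <body>
    <h1>Guide, 2ᵉ edition</h1>
  </body>
</html>
```

The charset declaration should appear early in the head so the browser and crawlers pick it up before decoding the rest of the document; the server's Content-Type header should agree with it.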
Discrepancy between Google Search Console and SERP
Question: Why does Google Search Console (GSC) report positions significantly different from the live SERP results?
Answer: The performance data reported in Google Search Console is based on actual impressions and clicks received by your web pages in search results. However, due to the dynamic nature of search, it can sometimes be challenging to reproduce the precise positioning of your pages in the SERPs. To understand the discrepancies between Search Console and the SERPs, you can utilize filters in Search Console to narrow down the data based on specific parameters such as country or device. By experimenting with different filters and analyzing the data, you can gain a better understanding of how your site is performing in search results.
Negotiating SEO with Google for Free
Question: Can I arrange SEO directly with Google for free?
Answer: No, Google does not provide direct SEO services or negotiate SEO deals for individual websites or businesses for free. Google Search is designed to serve all users equally, and the search engine's ranking algorithms determine how websites are ranked based on relevance and quality signals. While Google provides extensive documentation, guidelines, and resources to help website owners understand and improve their SEO, there is no direct negotiation or exchange of services between Google and individual website owners to influence rankings.
Removing Old Website from Search Results
Question: Our business is closed, but Google is still showing it in the website results. How can I remove the old website from search results?
Answer: To remove an outdated or closed business from search results, you can request removal of the old website through Google Search Console. Within Search Console, there is a specific tool that allows you to request the removal of URLs from the search index. By submitting a removal request for the outdated website, you can expedite the process of removing it from search results. However, keep in mind that it may take some time for search engines to reflect the changes and completely remove the outdated website from their indexes.
Trust between GA4 and Search Console
Question: With SEO reporting, should I trust GA4 (Google Analytics) or Search Console? What are the differences between both?
Answer: Both Google Analytics and Google Search Console provide valuable insights and data related to your website's performance. However, it is important to understand that the data collected and reported by these tools can differ due to various factors, including how data is collected and interpreted. Google Analytics primarily focuses on user behavior and interactions on your website, such as traffic sources, page views, conversions, etc. On the other hand, Google Search Console focuses more specifically on organic search performance, providing information about impressions, clicks, and average rankings in search results. Both tools offer unique perspectives and should complement each other in understanding your overall website performance and SEO efforts.
Including Periods in URLs as Part of SKU
Question: Can a URL contain periods as part of a SKU, as long as they are not immediately after each other?
Answer: Including periods within URLs is generally acceptable from an SEO perspective, as long as they do not violate any other URL rules. Periods are valid characters in URL paths, and as long as they are used appropriately and not excessively, they should not cause any SEO issues. However, it is important to ensure that the overall URL structure is logical, readable, and user-friendly. Using periods within SKUs should be driven primarily by your website's design and organization, ensuring that they serve a clear purpose and do not cause any confusion for users.
Measuring SEO Perfection
Question: How can I know if my SEO is perfect? Are there any tools, apps, or websites available for it?
Answer: Achieving "perfect" SEO is an ongoing process rather than a one-time task. SEO involves numerous factors, including technical optimizations, content quality, user experience, and relevance to search queries. There is no definitive measure of perfection, as search engine algorithms constantly evolve, and different websites have unique requirements. Tools like Google Search Console, Google Analytics, and other SEO software can provide valuable insights, data, and suggestions for improvement. However, it is essential to remember that the goal should be to create a website that offers the best possible experience for users, rather than focusing solely on achieving perfection according to specific metrics or tools.