Post by amirmukaddas on Mar 11, 2024 7:42:44 GMT
The big, age-old problem of e-commerce sites is parameterized URIs: the ones generated by classic search filters, by pagination links, by alternate views of archive pages, and sometimes by social sharing buttons (curse them). How do you plan to solve this problem? With the URL parameters tool in Search Console? That may be fine, but Google may not honor the settings you enter for individual parameters if those parameters appear practically everywhere on your site. In that case the solution is to establish a crawling priority at the code level itself, by differentiating the internal paths.
In particular, you can keep explicit paths on the page, with URIs present as the href attribute of an <a> tag, to steer the bot as a priority towards the pages you want to rank: products and archives. At the same time you can serve all the other paths, especially the parameterized ones mentioned above, as AJAX dynamic loads, which Google will certainly see and be able to follow, but at a lower priority level.

Conclusions: no shortcuts

If you do it like this you don't need to put pages in noindex, you won't need canonical tags and you won't have to block anything via robots.txt.
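The href/AJAX split described above can be sketched in a few lines. This is a minimal illustration, not a production implementation: the helper names, the parameter list, and the `js-load` class are all hypothetical, and you would tune the low-priority parameter list to your own faceted navigation.

```javascript
// Hypothetical list of query parameters that mark a low-priority,
// parameterized path (filters, sorting, alternate views, pagination).
const LOW_PRIORITY_PARAMS = ["sort", "filter", "view", "page", "color", "size"];

// Returns true when the URL carries at least one low-priority parameter.
function isLowPriority(url) {
  const params = new URL(url, "https://example.com").searchParams;
  return LOW_PRIORITY_PARAMS.some((p) => params.has(p));
}

// Product and archive URLs get a real, crawlable <a href>; parameterized
// URLs get a plain element with a data attribute, to be loaded client-side.
function renderLink(url, label) {
  if (isLowPriority(url)) {
    return `<span class="js-load" data-target="${url}">${label}</span>`;
  }
  return `<a href="${url}">${label}</a>`;
}
```

A small client-side click handler would then pick up `.js-load` elements and `fetch()` their `data-target` to swap in the filtered results, so the bot finds no explicit path to follow with priority, while users keep the full filtering experience.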
You will have worked at the source, making Google understand directly what matters and what doesn't, regardless of what it sees or doesn't see. You will have saved crawl resources, which will be allocated only (and better) towards the pages relevant to your business model. What is missing from this reasoning? A good developer. What I have described is not within everyone's reach, so if, on reading this article, your webmaster objects that the measures I consider insufficient are fine, don't get upset... and start looking for another webmaster.