Our competitors from India and China use bots to generate many spam visits to our website; the bots stay for a very short time and then leave, which lowers our site's average visit duration.
A low average visit duration will damage SEO and lower our ranking in Google.
- How can we prevent this?
- Does Google count the visit duration of Googlebot and other bots? These bots presumably also stay for only a short time.
- Does Google count the visit duration of visits that end in 404, 500, or other error status codes? Visits to error pages are also very short.
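If you want to measure what the human-only average duration looks like, one option is to filter bot traffic out of your own logs before averaging. The sketch below is a minimal illustration; the visit records, timestamp format, and bot patterns are all hypothetical, and real bot lists (e.g. the IAB spiders list) are much longer. Analytics platforms such as Google Analytics apply similar known-bot filtering automatically.

```python
import re
from datetime import datetime

# Hypothetical visit records: (user_agent, session_start, session_end).
# In practice these would come from your analytics or server logs.
visits = [
    ("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0", "2024-01-01 10:00:00", "2024-01-01 10:03:20"),
    ("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
     "2024-01-01 10:01:00", "2024-01-01 10:01:02"),
    ("python-requests/2.31.0", "2024-01-01 10:02:00", "2024-01-01 10:02:01"),
    ("Mozilla/5.0 (Macintosh) Safari/605.1", "2024-01-01 11:00:00", "2024-01-01 11:02:40"),
]

# A few common bot signatures; real filter lists are far more extensive.
BOT_PATTERN = re.compile(r"bot|crawler|spider|curl|python-requests", re.IGNORECASE)

def average_human_duration(visits):
    """Average session duration in seconds, excluding bot-like user agents."""
    fmt = "%Y-%m-%d %H:%M:%S"
    durations = [
        (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds()
        for ua, start, end in visits
        if not BOT_PATTERN.search(ua)
    ]
    return sum(durations) / len(durations) if durations else 0.0

print(average_human_duration(visits))  # → 180.0 (only the two human visits count)
```

Here the two short bot sessions are dropped, so the average reflects only the two human visits (200 s and 160 s).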
In such a case, how can I tell Google that the original post is the cornerstone, and that the other post it links to is NOT? Should I use nofollow or some other attribute?
If the link is relevant and helpful for your visitors, just add it. Internal linking is a great way to increase your visitors' dwell time. Many SEOs believe that Google uses dwell time as a ranking factor, since it shows that your content is relevant to the search query. So, over time, it might give you a better position in the SERPs.
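For reference, the nofollow attribute you mention is set on the anchor tag itself. A plain internal link passes link equity, while adding `rel="nofollow"` asks search engines not to credit the target (the URLs below are placeholders):

```html
<!-- Normal internal link: passes link equity to the supporting article -->
<a href="/supporting-article/">Read the supporting article</a>

<!-- With rel="nofollow": asks search engines not to credit this link -->
<a href="/supporting-article/" rel="nofollow">Supporting article</a>
```

As the answer above suggests, for ordinary internal links between your own related posts, the plain form is usually the right choice.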
Thank you very much.
According to Yoast, one should add internal links from other pages to the cornerstone page.
Is it also OK to add internal links from the cornerstone page to other pages? Will the cornerstone page lose importance if I add such links?
According to online guides, Google likes fresh content, so it is better to update content frequently.
However, on a product information page, most of the content will not change often, for example:
- Introduction of the product.
- Main features of the product.
- FAQs of the product.
- Testimonials of the product.
- Screenshots of the product.
- Guide of the product.
- Case studies of the product.
- Related products.
None of the above will change every day. It is also unlikely that we will release a new product version every day or even every week.
- How can we keep the product information page frequently updated?
- How much of the content should change (for example, 20% or 50%) for Google to consider the page fresh?
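One modest freshness signal you can control directly is the `<lastmod>` date in your XML sitemap, which should be updated only when the page genuinely changes. A hypothetical sitemap entry (the URL is a placeholder) follows the standard sitemaps.org format:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/product/</loc>
    <!-- Update lastmod only when the page content actually changes -->
    <lastmod>2024-05-01</lastmod>
  </url>
</urlset>
```

Bumping `<lastmod>` without real changes is not recommended, as search engines may learn to ignore the signal.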
Thank you so much!
That is not the case. The more relevant content a page has, the more it can benefit the page; you just need to mind the user experience. From what you have described, your page looks good in terms of UX too.
On our software product page, we include the following content:
- Product introduction.
- Comparison with competitors.
- Customer testimonials.
- How-to-use guide.
- Case studies.
- Related products.
The total word count is 6,301. Most of it (about 4,000 words) comes from the many customer testimonials.
I use a slideshow to show the testimonials.
My question is: will so many testimonials hurt the page's SEO? Should I remove them, or reduce their number?
I have a post. When I use Siteliner to check for internal duplicate content, I find that all of the duplicate content for this post comes from common page elements:
- Top menu items.
- Left panel items.
![Siteliner report showing internal duplicate content limited to common page elements](https://www.datanumen.com/temp/images/internal-duplicate.jpg "Siteliner internal duplicate content report")
The main body itself contains no duplicates.
In such a case, how should I deal with the duplicate content in this post?
I have two articles on my site. Both articles share the same paragraph, which describes the steps to fix a computer error.
How can I eliminate the duplicate content?
If I move that paragraph into a new article and have the two old articles link to it, the duplicate content is eliminated. However, the new article would then have too few words, making it thin content. And if I pad the new article with filler words, they would not actually be useful, so they would lower the article's quality.
Thanks to all of you.
Then what about content that is completely unrelated to our product, such as articles about ChatGPT and AI? Should we move it to another site entirely?
Our main business is selling software to recover corrupt MS Word documents. Meanwhile, we also publish some Word-related articles, such as:
- How to export all images in a Word document.
- How to convert a Word table into an Excel table.
These articles are not closely related to our main business, but they rank very high for some keywords and get a lot of traffic. Meanwhile, our main product page does not rank well for its main keyword, nor does it get much traffic.
So, my question is:
- Do the not-closely-related articles actually hurt the SEO of our main product?
- Should we simply remove them, or move them to another domain and add some backlinks to our main product?
Thank you for your suggestions.
Actually, I do have a unique landing page for each product, for example,
Meanwhile, I create an order page for each product with multiple purchase options and extra information; for the above products, the order pages are:
If I merged the order links for all products into one page, I could not include so many purchase options and so much information.
Hi, @ms (3244),
We have 30 products in total, and one order page (www.datanumen.com/order/) is like what you said: it contains all 30 products, and a user can click Order to proceed to the checkout page.
However, we also create a separate order page for each product, such as www.datanumen.com/outlook-repair-order/, to provide more purchase options, including:
- Two resellers so that user can choose from one of them.
- Phone order options.
- Order via distributors, etc.
The three items above are mostly the same across all the products, which causes duplicate content.
Hi, @binayjha (2841)
Thank you for your reply. It is rather difficult to create unique content for the order pages, since the only differences among them are the product names and prices.
I have a large WordPress blog with thousands of posts. By default, the blog homepage shows an excerpt of each post. Because there are so many posts, the homepage is paginated (1,341 pages in total).
When I check with Siteliner, it finds a lot of duplicate content across the blog homepage pages. What should I do about it? Should I noindex the homepage and all the paginated pages accordingly?
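If you do decide to keep the paginated archive pages out of the index (whether to do so is a judgment call), the usual mechanism is a robots meta tag in the `<head>` of each paginated page. The `follow` value keeps the post links on those pages crawlable even though the pages themselves are not indexed:

```html
<!-- On paginated pages (/page/2/ through /page/1341/), but typically
     not on the homepage itself -->
<meta name="robots" content="noindex, follow">
```

Many WordPress SEO plugins, including Yoast, expose a setting that applies this to archive pagination without editing templates.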
I have a website with 30 software products, and each product has its own order page. The problem is that the layout and content of these 30 order pages are very similar, except for the product name, so Siteliner reports them as duplicate content.
I am thinking of noindexing these pages. However, if a user then searches for "xxx order page", where xxx is one of our product names, they will not be able to find that product's order page, and the revenue goes away.
So, how should I handle this case?
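One alternative to a hard noindex (a judgment call, with the same revenue tradeoff you describe) is a canonical link on each per-product order page pointing at the main order page, so duplication signals are consolidated without blocking indexing outright. Using the URLs from your own question as the example:

```html
<!-- Placed in the <head> of www.datanumen.com/outlook-repair-order/ -->
<link rel="canonical" href="https://www.datanumen.com/order/">
```

Note that `rel="canonical"` is a hint, not a directive: Google may still index the individual order pages if it judges them distinct enough, which in your case may actually be the behavior you want for "xxx order page" searches.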
I notice that the data from Moz, Semrush, and Ahrefs are sometimes inconsistent.
For example, Ahrefs shows a large drop in referring domains in Nov 2021, but Semrush does not show such a trend.
And Moz shows traffic decreasing from Nov 2021 to Feb 2022, while Semrush shows an increase instead.
Why? And how can I determine which data source is accurate and reliable?