How to mark a specific article as noindex so it can be deindexed from Google?

Hello,

An author has asked us to “deindex” his article from Google. He does not want to delete the article; he just wants it to stop appearing in Google searches.

The journal has been completely inactive since 2011.

Is it possible to add the “noindex” tag to just one article? If it is possible, how and where can it be added?

We are using OJS 3.1.1.4 on a multi-journal, multi-institutional platform that will soon be updated to version 3.3. We do not have direct access to the database, but we could ask the institution to make changes there.

I know it is an unusual request, because usually people want to be indexed, not deindexed. We could simply tell him it is not possible, but I wanted to ask whether someone has dealt with a similar situation.

Thanks.

Yes, it is possible to add a “noindex” meta tag, which instructs search engines not to index that particular page.

  • Locate the template file for article detail pages within your OJS installation. It is typically named something like article_detail.tpl, depending on the version and theme you are using.
  • Add the following line within the <head> section of the HTML that the template generates (see the sketch after these steps):
<meta name="robots" content="noindex">
  • Once the “noindex” tag has been added, it may take some time for Google to re-crawl the page and update its index to reflect this change.
  • You can expedite this process by using Google’s Search Console to request a re-crawl of the specific page.
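
For illustration, a minimal sketch of the placement, assuming the <head> is built in a theme header template (the exact file and the surrounding markup depend on the OJS version and theme):

    <head>
        {* ... existing meta tags, title and stylesheets ... *}

        {* ask search engines not to index this page *}
        <meta name="robots" content="noindex">
    </head>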

There are other methods, but they are much more complicated.

@Lazar_Stosic - that would mark all articles as noindex. The .tpl files are templates used for every article, not for one specific article.
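
If the template route were used anyway, the tag would have to be wrapped in a condition on the article. A rough sketch, assuming Smarty syntax, a placeholder article ID 123, and variable names ($requestedPage, $requestedOp, $article) that may differ between OJS versions and themes:

    {* sketch only: emit noindex solely for one specific article page *}
    {if $requestedPage == 'article' && $requestedOp == 'view' && $article && $article->getId() == 123}
        <meta name="robots" content="noindex">
    {/if}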

@RCUB - I could think of the following ways:

  • if the sitemap has already been submitted to Google Search Console: create a removal request for the URL of that specific article. However, such a removal is only valid for about 6 months
  • create a Disallow rule for the URL of that specific article in robots.txt (located in the OJS root directory). However, it takes some time* until bots recognise it (see the sketch after this list)
  • create a blocking rule in the .htaccess file that returns a 404 HTTP status for the Googlebot user agent and the specific article URL (also sketched below). However, it takes some time* until the article is deindexed
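
Rough sketches of the last two options, using a placeholder journal path (myjournal) and article ID (123); the real URL pattern depends on how the installation is configured (for example, whether index.php appears in article URLs).

robots.txt in the OJS root directory:

    User-agent: *
    # placeholder journal path and article ID
    Disallow: /index.php/myjournal/article/view/123

.htaccess rule (assumes Apache with mod_rewrite enabled):

    RewriteEngine On
    # answer Googlebot with a 404 for this one article URL (placeholder path and ID)
    RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
    RewriteCond %{REQUEST_URI} ^/index\.php/myjournal/article/view/123$
    RewriteRule ^ - [R=404,L]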

Frankly, I don’t understand why the author has this unusual request. Is the author afraid of something? The Web never forgets. The article may also be indexed in many bibliographic databases.

*some time = weeks, months, years …

Thank you so much, @Lazar_Stosic, for your answer!
I am going to check with the administrators of the OJS platform whether your option or @mpbraendle's options could be implemented.

Thank you so much as well, @mpbraendle, for your answer!

Yes, it is a very unusual request. The article is more of an opinion essay than a peer-reviewed article, published in a student journal almost 20 years ago. Maybe the author now sees it as a youthful mistake?

I think I am going to ask about his motives first and perhaps check the COPE forums for examples of similar cases. Maybe the first thing to consider is whether the article should be deindexed or removed at all.

Thank you very much again for your time!