Viewing 4 replies - 1 through 4 (of 4 total)
  • Thread Starter vijayasawant

    (@vijayasawant)

How do I redirect broken URLs to a directory, and then prevent Googlebot from crawling that directory using a robots.txt file?

    Plugin Author Huseyin Kocak

    (@flashcentury)

    Hi,

Broken Link Manager can handle this job.

The first method
– Go to the Broken URLs menu
– Enter the redirect link (see the screenshot)
– Finally, click the Add button

    https://brokenlinkmanager.com/screenshot/Screenshot_17.26.02.png

The second method (bulk)
– Go to the Settings menu
– Check the “Redirect default URL” option
– Enter the default redirect URL
– Click the Save button

    https://brokenlinkmanager.com/screenshot/Screenshot_17.34.04.png
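For comparison, the same kind of redirect can also be set up manually at the web-server level, independently of the plugin. A minimal Apache .htaccess sketch (the path names here are hypothetical examples, not taken from the plugin):

```
# .htaccess in the site root (assumes Apache with mod_alias enabled).
# Sends requests for a known-dead page to a catch-all directory
# with a 301 (permanent) redirect. Both paths are examples only.
Redirect 301 /old-dead-page.html /broken-urls/
```

The plugin manages its redirects through its own settings, so this is only a reference for how the underlying mechanism works.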

    Thread Starter vijayasawant

    (@vijayasawant)

    Hello,

I created a directory on the server and redirected the broken links to it, as you described. I also disabled the email option, but I am still receiving emails. The directory has 707 permissions. I don’t see any URLs being automatically redirected there.

    What could be the reason?

    Regards,
    Vijaya

    Thread Starter vijayasawant

    (@vijayasawant)

    Hello,

I am a little bit confused by this plugin. I have created a redirection to https://www.oratechsolve.com/BrokenUrls/ but the file on the server does not show any logs. Broken Link Manager shows a total of 182 broken URLs. Why am I not seeing these URLs? How do I permanently remove these URLs from Google search? I removed them via Google Webmaster Tools, but they keep reappearing in Broken Link Manager.

Google says that I have to disallow the directory in a robots.txt file so Google does not access it. How can I confirm that the default URL setup for broken URLs is correct and working?
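The robots.txt step Google describes can be sketched as follows (assuming the redirect target is the /BrokenUrls/ directory mentioned above; robots.txt must live at the site root):

```
# https://www.oratechsolve.com/robots.txt
# Tells crawlers not to fetch anything under the redirect target directory.
User-agent: *
Disallow: /BrokenUrls/
```

Note that Disallow only stops crawling; it does not by itself remove URLs that are already indexed. For permanent removal, the URLs generally need to return 404/410 (or carry a noindex), combined with Google’s URL removal tool.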

    Thank you very much in advance,

    Vijaya

  • The topic ‘Broken links appear again after removal from google cache’ is closed to new replies.