[Broken Link Checker] Broken link checker that works for all sites in multisite install


It would be really great if we could see all broken links across the network at a network level.

  • Patrick Freitas
    • FLS

    Hi Troy Media

    I hope you are doing well.

    Thank you for your request, we forwarded it to the plugin team.

    The local engine would make it easier to implement such a feature, but since we only add the main site to the HUB, it would have some limitations. It can also time out on big networks (we are already working on some improvements regarding timeouts).

    Best Regards
    Patrick Freitas

    • Williams Valerio
      • Staff

      Hi TGL,

      I hope you’re doing well, and thanks for reaching out to us.

      While the new Broken Link Checker is cloud-based (at least for the scan feature), please keep in mind that there is still a legacy version that is not cloud-based and is still available to use. A plugin is also still needed to act as a bridge for certain actions, such as editing a link or unlinking it from your content.

      Also, please keep in mind that this thread is about the feature described in the thread description. You are, of course, free to create a new support ticket here if you run into any issues or need further help with a specific topic.

      This is simply to keep everything in order, so that we can provide you with the best assistance and the most accurate information possible.

      Best regards,
      Williams

  • Patrick Freitas
    • FLS

    Hi TGL

    The screenshot you shared is correct.

    However, let me explain the main difference between the two engines.

    The local engine scans your database, tries to locate links there, and then makes a request for each one. This works, but it also depends on how the links are generated; for example, a dynamically generated link will probably be skipped.

    The cloud engine is much faster since it works similarly to Google or any other crawler: it scans the front end for links and tests them. The benefits of this method are:

    – We don’t loop through the database, so no extra server resources are used.
    – Dynamic links are included as well.

    This can surface more links to test, and we do have a timeout limit, which is around 3 hours.

    On a huge network, depending on how fast the pages respond to the cloud scan, the scan can hit this timeout limit and fail.
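    To illustrate the crawler-style approach described above (this is not the plugin's actual code, just a minimal sketch in Python using only the standard library, with hypothetical names): the rendered front-end HTML is parsed for links, each link is tested, and the scan stops if an overall time budget is exhausted.

    ```python
    import time
    from html.parser import HTMLParser

    class LinkExtractor(HTMLParser):
        """Collect href targets from anchor tags in rendered HTML."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def find_broken_links(html, check_url, budget_seconds=3 * 3600):
        """Extract links from front-end HTML and test each with check_url,
        a callable returning an HTTP status code. Stops early when the
        time budget runs out, mirroring the scan timeout described above."""
        parser = LinkExtractor()
        parser.feed(html)
        deadline = time.monotonic() + budget_seconds
        broken = []
        for url in parser.links:
            if time.monotonic() > deadline:
                break  # scan timed out before checking every link
            if check_url(url) >= 400:
                broken.append(url)
        return broken

    # Example with a fake status checker instead of real HTTP requests:
    page = '<a href="/ok">fine</a> <a href="/gone">dead</a>'
    fake_status = lambda url: 404 if url == "/gone" else 200
    print(find_broken_links(page, fake_status))  # ['/gone']
    ```

    Because the checker works on the rendered HTML rather than the database, dynamically generated links are naturally included, which is the trade-off described above.
    
    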

    In this regard, our Sysadmins are already working on some improvements, as I explained in my initial response.

    So, as of now, if we tried to scan the subsites, you would probably face the situation described above.

    But we have also forwarded your request to our plugin team.

    Best Regards
    Patrick Freitas