Community Information
Benefits of crawled text vs negative 404 errors - which wins?
I've taken over a client site that has interactive catalogues which change regularly. These catalogues are self-contained HTML files with hotspot images, all hosted on a subdomain. The problem is that they are indexable, and as they get replaced with new editions (each with a different internal HTML structure), there is no way of remapping every URL without it being a ballache. The result is that I've now got thousands of 404s on the site. So which is the greater good: add a robots disallow rule across the entire subdomain and miss out on the SEO benefits of the crawled text, or keep those benefits but live with the 404s?
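For reference, a blanket disallow for the subdomain would look something like the sketch below. Note that robots.txt is per-host, so it has to be served from the subdomain's own root (the hostname here is hypothetical). Also worth knowing: Disallow only blocks crawling, not indexing, so URLs that are already indexed may linger in results for a while.

```
# Served at https://catalogues.example.com/robots.txt (hypothetical hostname)
# Blocks all compliant crawlers from every path on this subdomain.
User-agent: *
Disallow: /
```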
© 2025 Indiareply.com. All rights reserved.