Norconex Crawlers (or spiders) are flexible web and filesystem crawlers for collecting, parsing, and manipulating data from the web or filesystem into various data repositories such as search engines.
Norconex Filesystem Collector is a flexible crawler for collecting, parsing, and manipulating data from local hard drives and network locations into various data repositories such as search engines.
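As a rough illustration of the core idea behind such a filesystem crawler (this is not the Norconex API; the class name, root-path handling, and the `collect` step are hypothetical placeholders), the following Java sketch walks a directory tree and hands each regular file to a downstream collection step:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

// Minimal sketch of a filesystem crawl: walk a directory tree,
// keep only regular files, and pass each one to a downstream step
// (parsing, indexing, committing to a search engine, etc.).
public class SimpleFilesystemCrawl {

    public static void main(String[] args) throws IOException {
        // Root directory to crawl; defaults to the current directory.
        Path root = Path.of(args.length > 0 ? args[0] : ".");

        try (Stream<Path> paths = Files.walk(root)) {
            paths.filter(Files::isRegularFile)
                 .forEach(SimpleFilesystemCrawl::collect);
        }
    }

    // Placeholder for the "collect, parse, and commit" work a real
    // crawler would perform on each file.
    private static void collect(Path file) {
        System.out.printf("collected: %s (%d bytes)%n",
                file, file.toFile().length());
    }
}
```

A production crawler adds the pieces this sketch omits: include/exclude filters, content and metadata extraction, incremental recrawls, and committers that push documents to a target repository.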