Indexing refers to the process of collecting website data in a structured format (a search index) that is optimized for a search algorithm. When developing search engine infrastructure, developers need to devise a system for indexing existing website content as well as new content as it is published.
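A search index is commonly built as an inverted index: a mapping from each term to the set of pages that contain it, so queries resolve by lookup rather than by scanning every page. The sketch below illustrates the idea in Python; the page data and helper names are illustrative only and do not reflect any particular provider's implementation.

```python
from collections import defaultdict

def build_index(pages):
    """Build a minimal inverted index mapping each term to the
    set of page URLs whose content contains that term."""
    index = defaultdict(set)
    for url, text in pages.items():
        for term in text.lower().split():
            index[term].add(url)
    return index

def search(index, term):
    """Return the URLs whose content contains the query term."""
    return index.get(term.lower(), set())

# Hypothetical crawled pages: URL -> extracted text.
pages = {
    "/about": "Swiftype powers site search",
    "/blog": "how search indexing works",
}
index = build_index(pages)
print(search(index, "search"))  # both pages contain "search"
```

A production index adds tokenization, stemming, ranking signals, and incremental updates as new pages are published, but the core lookup structure is the same.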
While many third-party search providers require site owners to pass website information to their search index via an XML file or data feed, Swiftype indexes website content with a high-performance web crawler, Swiftbot. The frequency of this indexing process varies by pricing plan. Free users receive partial recrawls of their website content each day, with the option to initiate a full recrawl once a week. Enterprise customers can initiate full recrawls each day, with the option to enable Constant Crawl so that Swiftbot indexes new website content immediately as it is published.
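For comparison, the feed-based approach used by many providers is typically a standard sitemap file following the sitemaps.org protocol, listing page URLs and last-modified dates for the provider to ingest. A generic example (the URL and date are placeholders, not tied to any provider):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/new-post</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

A crawler-based approach removes the need to maintain such a feed: the crawler discovers and fetches pages itself, though a sitemap can still help it find content faster.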