Crawler
A crawler automatically updates the list of websites. Only registered partners at GetKirby.com can use this website's API.
How the Crawler works
The crawler performs multiple steps at certain intervals to update the list of reports. It is polite: it respects robots.txt, throttles its requests, and identifies itself as belonging to this service.
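The crawler's implementation is not part of this page, but the politeness rules above can be sketched in a few lines of Python. The user-agent string and the delay between requests are assumptions for illustration, not the crawler's real values:

```python
import time
import requests
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

# Assumed values for illustration; the real crawler identifies itself
# as related to this service with its own user-agent and throttle.
CRAWLER_UA = "SiteListCrawler/1.0 (+https://example.com/crawler)"
REQUEST_DELAY = 5  # seconds between requests to the same host

def polite_get(url):
    """Fetch a URL only if robots.txt allows it, with a fixed delay."""
    base = "{0.scheme}://{0.netloc}".format(urlparse(url))
    robots = RobotFileParser(base + "/robots.txt")
    robots.read()
    if not robots.can_fetch(CRAWLER_UA, url):
        return None  # robots.txt disallows this URL, so skip it
    time.sleep(REQUEST_DELAY)  # simple throttle
    return requests.get(url, headers={"User-Agent": CRAWLER_UA}, timeout=30)
```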
Partners' websites are automatically added and linked to their user accounts if they are listed on their GetKirby partner profile or if I was able to write a simple enough crawler for their own portfolio.
Hourly
- Get any missing screenshots for published websites.
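How the screenshots are taken is not documented here; a minimal sketch using Playwright (an assumption, any headless browser would work) could look like this:

```python
from playwright.sync_api import sync_playwright

def capture_screenshot(url, out_path):
    """Open the page in a headless browser and save a full-page screenshot."""
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page(viewport={"width": 1280, "height": 800})
        page.goto(url, wait_until="networkidle")
        page.screenshot(path=out_path, full_page=True)
        browser.close()

# capture_screenshot("https://example.com", "screenshots/example.png")
```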
Nightly
- Find all partners listed on GetKirby.com.
- Extract their public name, email, portfolio website, partnership status and referenced websites.
- Synchronize active partners with user accounts.
- Add partners' websites as drafts and link them to their user accounts.
- Crawl the official showcase for websites.
- Crawl the awesome KirbySites.com for even more websites.
- Crawl the portfolios of a few partners for their websites (where the DOM structure is simple enough).
- Check whether the added websites use Kirby and publish only those that do (a sketch of such a check follows after this list).
- Retrieve the meta title and description for each published website; a screenshot follows within the next hour.
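The Kirby check is not specified in detail here; a minimal sketch could look for indicators that Kirby installations commonly leave in the HTML, such as the generated /media/pages/ paths or an optional generator meta tag. The user-agent and the list of indicators are assumptions:

```python
import re
import requests

USER_AGENT = "SiteListCrawler/1.0 (+https://example.com/crawler)"  # assumed UA

def looks_like_kirby(url):
    """Heuristic check for Kirby based on assumed indicators in the HTML."""
    html = requests.get(url, headers={"User-Agent": USER_AGENT}, timeout=30).text
    indicators = [
        r"/media/pages/",                              # Kirby's generated media folder
        r'<meta\s+name="generator"\s+content="Kirby',  # optional generator meta tag
    ]
    return any(re.search(pattern, html, re.IGNORECASE) for pattern in indicators)
```

A website that matches none of the indicators would stay a draft rather than be published.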
Weekly
- Update the screenshot of a website unless a custom one was uploaded via the API.
- Check whether the sitemap indicates a change and store the date and time (see the sketch below).
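A sketch of the sitemap check, assuming the sitemap lives at /sitemap.xml and uses the standard sitemaps.org schema; how the stored date-time is compared is an assumption:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def latest_sitemap_change(site_url):
    """Return the newest <lastmod> value from the site's sitemap, if any."""
    resp = requests.get(site_url.rstrip("/") + "/sitemap.xml", timeout=30)
    if resp.status_code != 200:
        return None  # no sitemap found
    root = ET.fromstring(resp.content)
    lastmods = [el.text for el in root.findall(".//sm:lastmod", SITEMAP_NS) if el.text]
    return max(lastmods) if lastmods else None

# The crawler would compare this value with the previously stored one
# and only record a new date-time when it differs.
```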
Monthly
- Check whether all listed websites still use Kirby and unpublish those that no longer do.
- Update the meta title and description for all websites.
- Run a single-pass performance report on Webpagetest.org for each listed website and store the results. Only the top-level URL is tested, not every content page; analyzing subpages is part of my audit. (A sketch of triggering such a run follows after this list.)
- The report includes a Google Lighthouse sub-report with all Core Web Vitals scores and sets the current screenshot if no custom one was provided via the API.
- Websites linked to partners get an additional SEO report created via Seobility.com. Their service checks the technical implementation and the quality of the content.
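The single-pass run could be triggered through Webpagetest.org's runtest.php API; the sketch below assumes the classic endpoint with an API key passed as the k parameter, and it only starts the test (the results are polled later):

```python
import os
import requests

WPT_API_KEY = os.environ["WPT_API_KEY"]  # assumed: the API key comes from the environment

def run_single_pass(url):
    """Trigger a single-run test for the top-level URL only."""
    resp = requests.get(
        "https://www.webpagetest.org/runtest.php",
        params={
            "url": url,      # top-level URL only, not every content page
            "runs": 1,       # single pass
            "f": "json",     # ask for a JSON response
            "k": WPT_API_KEY,
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()  # contains the test id and result URLs to poll later
```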
Why run the reports only monthly?
Because running the reports costs me money, and I currently cannot offer shorter intervals.
If you hire me to create an audit for your website, the reports will be updated as needed.
How consistent are the reports?
If you use the services to generate the reports yourself, you will most likely have different scores. This is to be expected and influenced by various factors.
For one, it strongly depends on the quality of your hosting and on how long your host keeps PHP-FPM running and the loaded files "hot" in RAM after the last visitor. A "cold" website will yield far worse results than one that had a visitor just a few minutes ago. This is why my crawler pings the website once nightly (CEST) and the tests are run immediately afterwards, to minimise these side effects.
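A sketch of that warm-up step, assuming a plain GET as the "ping" before the measurement starts:

```python
import requests

def warm_up_and_test(url, run_test):
    """Ping the site once so PHP-FPM and caches are "hot", then measure right away.

    run_test stands in for whatever triggers the actual report,
    e.g. the single-pass run sketched above.
    """
    requests.get(url, timeout=30)  # warm-up request ("ping")
    run_test(url)                  # run the test immediately afterwards
```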
Further, your device and internet connection might change things up a bit. However, with a third-party service like Webpagetest.org, the parameters of the testing system and connection quality are consistent, logged, and configurable if needed. All reports generated can be compared across different execution dates with reasonable accuracy.
Some hosting providers intentionally delay responses to the report services, resulting in a very bad TTFB. But that does not pose a big issue when doing an audit, as I will run multiple passes, both automated and manual, to create my dataset.
Consider the reports listed here as indicators of potential issues rather than final conclusions.