Screaming Frog Log File Analyser Update – Version 5.0
We’re pleased to announce the release of the Screaming Frog Log File Analyser 5.0, codenamed ‘by the Sea’.
If you’re not already familiar with the Log File Analyser tool, it allows you to upload your server log files, verify search engine bots, and get valuable insight into search bot behaviour when crawling your website.
We’ve been busy working on heavily requested features and improvements. Here’s what’s new.
1) Updated Bot Verification
Search bot verification for Googlebot and Bingbot has been updated to use their public IP lists, which were kindly provided by the search engines, rather than performing a reverse DNS lookup. This means you can verify search bots almost instantly – saving loads of time.
Other search bots, such as Baidu and Yandex, still go through reverse DNS verification, so if they're not required, remove them from the default selection when setting up a new project to speed up the process further.
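Verification against a published IP list can be sketched as follows. This is an illustration only, not the LFA's internal logic, and the CIDR ranges below are placeholders; Google publishes its real, regularly updated Googlebot ranges as a JSON file.

```python
import ipaddress

# Illustrative CIDR ranges only -- the real Googlebot ranges are
# published by Google as JSON and change over time.
GOOGLEBOT_RANGES = [
    ipaddress.ip_network("66.249.64.0/19"),
    ipaddress.ip_network("66.249.90.0/24"),
]

def is_verified_googlebot(ip: str) -> bool:
    """Check whether an IP falls inside any of the listed Googlebot ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in GOOGLEBOT_RANGES)

print(is_verified_googlebot("66.249.66.1"))   # True  (inside 66.249.64.0/19)
print(is_verified_googlebot("203.0.113.5"))   # False (documentation range)
```

A simple range-membership check like this is why IP-list verification is near-instant, while reverse DNS requires a network round trip per IP.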
2) Import Custom Log Formats
The Log File Analyser has been upgraded to support a wider variety of log file formats automatically, and now provides the ability to view and customise fields to be used within the log file. While the LFA can automatically parse virtually all log formats, this is useful when log files are extremely customised, or a required field is missing.
You’re able to preview what’s in the log file and which log file components have been selected against which fields in the tool, and adjust if necessary. To enable this feature, select the ‘Show log fields configuration window on each import’ option under the ‘New Project’ configuration.
You can view which log file components are used for each field, and customise or debug them. This is an advanced feature, intended in general for more complex use cases.
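Mapping log file components to named fields can be sketched with a regular expression. The pattern below assumes an Apache 'combined'-style line and is purely illustrative; the LFA's field configuration window does this interactively rather than via hand-written regex.

```python
import re

# Hypothetical field mapping for an Apache 'combined'-style log line.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) (?P<protocol>[^"]*)" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
)

line = ('66.249.66.1 - - [20/Jun/2022:10:15:32 +0000] '
        '"GET /blog/ HTTP/1.1" 200 5120')

match = LOG_PATTERN.match(line)
fields = match.groupdict() if match else {}
print(fields["url"], fields["status"])  # /blog/ 200
```

If a line in a customised format fails to match, inspecting which component landed in which field (as the preview window lets you do) is usually the quickest way to debug it.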
3) Import CSVs & TSVs
It can be a struggle to get hold of log files, and when you finally do, the data can arrive in formats that aren't raw access logs, such as CSVs or TSVs. Rather than forcing you to chase down raw logs, you can now upload these file types directly into the Log File Analyser.
Just drag and drop them into the interface in the same way you import log files, and the LFA will automatically detect the log components and upload the events for analysis.
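Automatic detection of this kind can be sketched with the standard library's delimiter sniffer; this is an illustration of the idea, not how the LFA implements it internally.

```python
import csv
import io

# A tab-separated export of log events (could equally be comma-separated).
sample = "url\tstatus\ttimestamp\n/blog/\t200\t2022-06-20T10:15:32\n"

# Sniff whether the data is comma- or tab-delimited, then parse the rows.
dialect = csv.Sniffer().sniff(sample, delimiters=",\t")
rows = list(csv.reader(io.StringIO(sample), dialect))
print(rows[1])  # ['/blog/', '200', '2022-06-20T10:15:32']
```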
4) Dark Mode
If analysing log files wasn’t cool enough alone (and you hadn’t already realised from all the screenshots above), you can now switch to dark mode in the Log File Analyser.
To switch to dark mode, just hit ‘Config > User Interface > Theme’ in the top menu.
5) URL Events over Time
The ‘URLs’ tab now has a lower window ‘Chart’ tab, which shows events over time in a graph when you click a URL.
This makes it easier to visualise a page’s crawling activity and trends than sifting through the raw data in the ‘Events’ tab for each URL.
6) Exclude Configuration
Similar to the already released Include configuration, you’re now able to provide a list of regexes for URLs to exclude from importing, to further help focus on the areas you’re interested in analysing.
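Regex-based URL exclusion can be sketched like this; the patterns are made-up examples (skipping static assets and a staging path), not defaults from the tool.

```python
import re

# Hypothetical exclude patterns -- static assets and a staging section.
EXCLUDES = [re.compile(p) for p in (r"\.(?:css|js|png)$", r"^/staging/")]

urls = ["/blog/", "/assets/app.js", "/staging/test", "/contact/"]

# Keep only URLs that match none of the exclude regexes.
kept = [u for u in urls if not any(p.search(u) for p in EXCLUDES)]
print(kept)  # ['/blog/', '/contact/']
```

Excluding at import time, as the LFA does, also keeps the project database smaller than filtering after the fact.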
7) Countries Tab
There’s a new ‘Countries’ tab, which shows data and a map of activity based upon the IPs from a log file.
This can be used in combination with the user-agent global filter to monitor whether a search engine is crawling from one specific location for example.
8) Apple Silicon & RPM for Fedora
We’ve introduced a native Apple Silicon version for those using shiny M1 Macs, and an RPM for Fedora Linux users.
In limited internal testing, we found the native Apple Silicon version imported log files up to twice as fast as the emulated version running through Rosetta.
These will be introduced for future versions of the SEO Spider as well.
Version 5.0 also includes a number of smaller updates, security and bug fixes, outlined below.
- The LFA has been updated to Java 17.
- JSON support has been significantly improved. As well as logs with one JSON blob per line, the LFA can now handle a single JSON blob containing an embedded array of log events, and log files where a JSON or CSV/TSV field is itself an entire embedded log line – e.g. an Apache log line stored as a single JSON value.
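The JSON shapes described above can be illustrated as follows; the field names (`events`, `message`) are invented for the example and aren’t prescribed by the LFA.

```python
import json

# 1) One JSON blob per line ("JSON Lines"):
jsonl = '{"url": "/a", "status": 200}\n{"url": "/b", "status": 404}\n'
events = [json.loads(line) for line in jsonl.splitlines()]

# 2) A single JSON blob with an embedded array of log events:
blob = '{"events": [{"url": "/a", "status": 200}, {"url": "/b", "status": 404}]}'
events_from_array = json.loads(blob)["events"]

# 3) An Apache log line embedded as a single JSON value:
wrapped = ('{"message": "66.249.66.1 - - [20/Jun/2022:10:15:32 +0000] '
           '\\"GET /a HTTP/1.1\\" 200 512"}')
raw_line = json.loads(wrapped)["message"]

print(len(events), len(events_from_array))  # 2 2
print(raw_line.split()[0])                  # 66.249.66.1
```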
We hope the new update is useful.
If you’re looking for inspiration for log file analysis, then check out our guide on 22 ways to analyse log files for SEO.
As always, thanks to the SEO community for your continued support, feedback and feature requests. Please let us know if you experience any issues with version 5.0 of the Log File Analyser via our support.
Small Update – Version 5.1 Released 20th June 2022
We have just released a small update to version 5.1 of the Log File Analyser. This release is mainly bug fixes and small improvements –
- Add support for CSV files containing JSON values.
- Add restart button to Workspace configuration dialog.
- Fix regression importing request lines that do not contain the HTTP version.
- Fix issue with restart sometimes not working on Windows.
- Fix crash showing advanced import dialog.
- Fix crash exporting tab with no results.
- Fix crash importing binary file.
- Fix crash importing project from older versions.
Small Update – Version 5.2 Released 7th September 2022
We have just released a small update to version 5.2 of the Log File Analyser. This release is mainly bug fixes and small improvements –
- Add support for importing TSV files with embedded JSON.
- Add support for some new timestamp formats.
- Fix issue preventing log files with uppercase protocol prefixes being imported.
- Fix crash importing log with unsupported timestamp.
- Fix crash analysing logs.
- Fix double quotes in the ‘unable to import log file’ dialog.
- Fix crash displaying context menu.