How to Write Product Copy That Sells

A shopper enters a store. This isn’t a joke, don’t worry.

They want a vacuum cleaner, so they pick up the display model. It’s light. They grip the handle. It’s comfortable. They check the size and capacity of the bin and give it a test spin across the shop floor.

“Yep, it’ll fit in the cupboard”.

They buy it.

Another shopper lands on an ecommerce website to buy the same vacuum. How do they achieve the in-store experience?

Copy. And images. But mostly copy. On-site copy needs to give shoppers everything they need to make a confident purchase.

Which challenges does the product solve? How big and heavy is it? What are the key features?

Ecommerce sites need copy that sells.


Easy to Read

Online shoppers don’t have time to wade through paragraphs of copy. They want to find out what makes the product right for them. And quickly.

So, keep it simple. Call out the key benefits of the product. Create scenarios and explain how the product solves common challenges. Does it save shoppers time on daily chores? Maybe it’s made with a more comfortable material than alternative products.

Aim to convert them within one minute. This means short sentences, bold selling points and no long paragraphs of text.

But don’t take my word for it. Google’s John Mueller recently confirmed that “fluff content” can make it “hard for search engines to figure out what you’re trying to say”.

If you want your products to rank on search engines – and customers to find them – keep it snappy and focus on the USPs.


Formatting

When it comes to product copy, formatting is as important as content.

Even impeccable copy gets lost in chunky paragraphs at the bottom of pages, and shoppers won’t hesitate to jump to a competitor page if they can’t find what they’re looking for.

So, how should you format product copy?

Keep it natural. This means letting the information dictate its presentation. Break copy into sections that make sense, for example:

  • Key features – the main product benefits at the top of the page
  • Body copy – the main copy section, broken into small, scannable sentences creating user scenarios
  • Product spec – the technical information, like dimensions, weight and accessories, possibly in a digestible table
  • User info – which may include FAQs, reviews and more

Not every category of product will need the same sections, so the key is choosing those that make sense for the page and allow the user to easily find what they’re after.

Imagine you’re shopping for the product. What information would you like to know? The price? The material? The size? Why it’s better than alternative products?

Answer these questions and group them into natural sections.

Also, remember to leave plenty of white space on the page and top-load your content. Studies show online readers scan copy in an ‘F’ pattern, so the important stuff needs to be at the beginning of each sentence.

This means keeping copy active with short, snappy sentences filled with calls to action, like:

“Browse our range of accessories”

“Choose the perfect colour for your living room”

This drives action compared with:

“We have a range of colours available so there is a choice for every living room”


Solve Customer Problems

Why do people buy things? To solve their problems. Whether it’s a serious dilemma like how to clean their kitchen or fix their car, or a personal conundrum like ‘what do I wear this weekend?’.

Copy should provide the solutions. Tell short stories with your product copy to help the shopper picture how the product improves their life. For example:

“Save time on household cleaning with our new vacuum”

“Kit out your summer wardrobe with our floral dresses”

Copy should focus on solutions, not attributes. For example:

“Carry all your books, stationery and gadgets with this backpack”

Rather than:

“Large bag”

Every word is valuable. Don’t waste copy on empty adjectives like ‘good’, ‘nice’ or ‘excellent’. Instead, use each word to explain why the product is all these things.

Use tools like Answer the Public for extra inspiration – to find out which questions people are searching around your product. For example, searching ‘handbag’ may give you inspiration for your copy by revealing searches like ‘best summer handbags’ or ‘best material for handbag’.


Rank on Search Engines

Return customers and fans of your brand know exactly where to find your products. But what about those who haven’t shopped with the brand or visited the website before?

Product and category pages need to rank well on search engines for shoppers to find them. This means they need to provide value for the reader and naturally target keywords.

Keywords

Product pages are likely to lean heavily on a single target keyword, like ‘tennis ball’ or ‘chocolate bar’. These should be naturally included throughout the page, where relevant, without keyword stuffing.

While product copy is unlikely to be the best place to target all informational searches – like ‘which is the best tennis ball’ or ‘difference between tennis balls’ – it’s perfect for targeting transactional queries. You can always create separate guides targeting these informational terms and include links to your product page in the call to action.

Transactional queries often include long-tail keywords that show the user is ready to convert, like ‘where to buy tennis balls London’ or ‘cheapest tennis balls for sale’. Remember, these long-tail terms can have low search volumes, but their intent means they are just as valuable, so don’t get hung up on numbers.

Consider how you can work as many of these unique, relevant key terms into your product copy as possible. This may mean simply working them naturally into body copy, using them as content headings or including an FAQ section. But remember, they must flow naturally in full prose and not be stuffed into descriptions, as keyword stuffing can be flagged as spam.

E-A-T

Another search engine must-have is expertise, authoritativeness and trustworthiness (E-A-T). Search engines – and users – want to know the product is high quality and manufactured and sold by industry experts, so they know where their money is going.

There are plenty of ways to demonstrate E-A-T on product pages, including adding user-generated content to show products in action. But we can also do it through copy.

Think about the information that gives shoppers confidence to part with their money. This may include a detailed product spec table that features dimensions, materials and more. Similarly, leaning on experts adds authority to product descriptions, for example, are your tennis balls ‘used by professional athletes’?

Additional information that reassures shoppers may include ‘how to care for your product’ copy, expert quotes and links to further resources or guides that help shoppers make up their minds when browsing products.


Calls to Action

The call to action on product pages is obvious… buy the product. But that doesn’t mean it’s easy to write.

Consider conversion button copy and how it reflects the product and the way users typically shop. If they typically buy multiple, smaller value items, could ‘add to basket’ drive action over the traditional ‘buy now’? Similarly, could ‘fast checkout’ be more tempting for impatient shoppers?

If the product is a service or experience, reflect this in the copy, for example ‘book now’.

Calls to action can be used to upsell other products, too. Think about compatible products and internally link to them in the copy. This might look something like:

“Carry our tennis balls effortlessly from home to court with our portable cases, designed to hold up to six balls”.


For Your Inspiration

Let’s look at an example of a product page that brings this all together.

MyProtein nails product copy by understanding its audience and what they’re after. And it breaks its information down into neat, natural sections and keeps the technical stuff concise and scannable.

Here’s an example of product copy for one of MyProtein’s protein bars:

The main product overview covers the key points – the product’s taste and protein content – while the key benefits are scannable and digestible.

MyProtein also includes some common questions about the product with concise answers, as well as covering all the information users need – like when to eat the product and its ingredient profile and nutritional content – in collapsible boxes, so they can easily find what they’re after.

Now, let’s compare this to a competing protein bar page:

Consider everything an experienced athlete or first-time buyer would want to know when buying a protein bar. This may include the total protein content per serving, wider nutritional information and ingredient profiles and the benefits of supplementing protein in bar form.

Would this page be useful or likely to drive action among either audience? There’s nothing to set this protein bar apart from any other or convince users they need the product. Compared with the MyProtein copy, the user is completely in the dark.

Which bar would you rather buy?


Tying It All Together

All our top tips for product copy have something in common – user experience. Each element of your copy should be targeted to the reader.

This means considering how they read information, which questions they’re asking and what their challenges and pain points are.

Next time a product goes live on your site, follow our checklist for effective online sales copy:

  • Keep copy short, snappy and digestible – don’t write for the sake of words
  • Imagine you’re the buyer – what information and questions would you want covered?
  • Let the content dictate the format – no square pegs in round holes
  • Solve problems rather than talking about product features
  • Consider keywords and trust signals to rank on search engines
  • Keep CTAs relevant to the product and consumer

For more insights on how to create killer product copy or to transform your product pages today, enquire about Screaming Frog’s Digital Copywriting service.

Leveraging Paid Data for SEO Insights

PlayStation or Xbox? Apple or Android? These debates have divided the friendship groups of millions around the world, but within the digital marketing world, it’s SEO and PPC that are sometimes a cause of contention.

However, in a world seemingly full of division, I’ve looked at how we can help each other instead of causing more unrest. PPC data can be used to progress overall SEO efforts, resulting in an efficient and holistic approach to digital marketing.


PPC Fundamentals

For anyone new to PPC, pay-per-click is a model of digital marketing where, at least traditionally, advertisers pay a fee each time one of their ads is clicked, whilst utilising a myriad of messages, networks and targeting settings.

PPC could therefore be described (particularly by the SEOs among us) as “a way of buying visits to your site rather than attempting to earn those visits organically”, but there is lots of valuable data that can be unlocked when you pay Google money, most of which is immediate and highly transferable. So, without further ado, let’s dive in.


Keyword Discovery

With Google now hiding the majority of organic keyword data within Google Analytics, bucketing it under ‘(not provided)’, SEOs have been left to explore workarounds to ‘unlock’ this data. Whilst you’re still able to access some of it with Google Search Console, you’ll never be able to reveal everything.

*Inserts credit card details*

Paid keywords, however, have a great level of data on offer. Metrics such as sessions, time on page and goal completions are available in Analytics, and clicks, cost and conversions are ready to review in Google Ads.

Delving deeper into Google Ads, Search Term Reports can provide this data down to the exact term, and are therefore useful for several reasons:

  • It allows you to easily find top performing keywords that you can align your pages toward (look for both high volume and high converting terms).
  • It can help identify gaps in your content, especially around question-based terms (can, why, how, where, what).
  • You can find expensive but high-performing keywords that may make sense to target through SEO/content marketing, helping to free up ad budget.
  • You can also flag low-performing keywords that should potentially be avoided (unless they have lower commercial intent, in which case they can be used at the top of the funnel or for informational content).

Finally, the Google Ads Keyword Planner allows you to bulk check the average monthly search volume for keywords across different countries and languages. It also details seasonal trends, which can be useful when diagnosing traffic drops.
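
To make this concrete, here’s a minimal sketch of mining a Search Terms Report export with pandas. The file name and column names (‘Search term’, ‘Clicks’, ‘Cost’, ‘Conversions’) are assumptions based on a typical Google Ads CSV export, so adjust them to match your own account:

```python
import pandas as pd

# Assumed file and column names from a typical Search Terms Report export.
df = pd.read_csv("search_terms_report.csv")

# Cost per conversion, avoiding division by zero.
df["cost_per_conv"] = df["Cost"] / df["Conversions"].replace(0, pd.NA)

# High-volume, high-converting terms worth aligning pages toward.
winners = df[(df["Clicks"] >= 100) & (df["Conversions"] >= 10)]

# Expensive but high-performing terms that may be cheaper to win organically.
print(winners.sort_values("cost_per_conv", ascending=False).head(10))

# Question-based terms that hint at content gaps.
print(df[df["Search term"].str.match(r"(?i)(can|why|how|where|what)\b")])
```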


Ad Copy

As we’re all aware, organic listings use the landing page metadata (aside from when Google feels like meddling, that is) to create results on the SERPs, making that metadata the main reference point for users.

When creating ads, advertisers have the ability to take full control of what copy is included, as well as where that ad will send the user. This level of control means that advertisers know exactly what the user is seeing, how often they’re clicking it and how often a user goes on to convert, so over time you can identify the best-performing messages.

Responsive Search Ads (the new default ad type as of June 30th, 2022) are particularly good at honing this performance: a maximum of 15 headlines and 4 descriptions (43,680 possible combinations) are tested repeatedly at every auction to identify what ad copy works best, and for what search term.
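
That 43,680 figure checks out if you assume a served ad uses exactly three of the headlines and either one or two of the descriptions, with order mattering in both cases. A quick sanity check:

```python
from math import perm

# Up to 3 of 15 headlines are shown, and order matters.
headlines = perm(15, 3)                  # 2,730
# Either 1 or 2 of 4 descriptions are shown, order matters.
descriptions = perm(4, 1) + perm(4, 2)   # 4 + 12 = 16

print(headlines * descriptions)          # 43,680
```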

This can give those in the world of SEO great insight into what resonates with searchers for particular terms, and can therefore be used to inform metadata recommendations. This can include changes to page titles, meta descriptions and on site CTAs or copy.

The main caveat here is that just because copy works well for Google Ads, it doesn’t mean it will work well for organic. But if your PPC team is sitting on this sort of information, it’s worth asking them (always after their morning cuppa).


Quality Score

Paid keywords are given a quality score out of 10 based on how they compare with competitors over the last 90 days. This is based on three components: expected CTR, ad relevance and landing page experience, where each is graded as below average, average or above average.

Quality score can therefore help identify weak links with regards to the headlines and descriptions on site, as well as your landing pages in general.

As an example, say a particular keyword has an above average score for ad relevance and expected click-through rate, but a below average score for landing page experience. This can be used to help isolate an issue on the site, as well as highlighting which stage of the user journey your competitors are outperforming you in.

Equally, if the expected CTR metric is above average, this highlights the language your users are receptive to, which can be replicated on your landing pages.


Paid & Organic Reports

Within the reporting realms of the Google Ads account, Paid & Organic Reports help advertisers review paid data alongside organic data.

This allows marketers to see collective data for their various listings, such as clicks, CTR and average position, across a number of search queries, whether that’s just organic, just paid or both. See below for what you can expect to find.

For PPC purposes, we are able to discover additional keywords to target, particularly in instances where only organic results appear.

The same therefore applies to SEO, where you can see when ads are appearing (and if they’re performing well), but organic results are not.

You can also spot where you rank highly for both organic and paid (or in some circumstances where organic is higher), which can help cut business costs should you decide to drop these ads.

Note: in order to review this report, your Google Ads account must be linked with Search Console.
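
As an illustration of that gap analysis, here’s a hedged sketch assuming a Paid & Organic report export; the column names (‘Search query’, ‘Ad impressions’, ‘Organic listings’, ‘Organic clicks’) are hypothetical stand-ins for whatever your export actually contains:

```python
import pandas as pd

# Hypothetical column names; rename to match your actual export.
df = pd.read_csv("paid_and_organic_report.csv")

# Queries where ads appear but no organic listing does:
# candidates for new or better-optimised organic content.
paid_only = df[(df["Ad impressions"] > 0) & (df["Organic listings"] == 0)]

# Queries where only organic appears: potential new keywords for PPC.
organic_only = df[(df["Organic clicks"] > 0) & (df["Ad impressions"] == 0)]

print(paid_only["Search query"].head(10))
print(organic_only["Search query"].head(10))
```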


More Quick-Fire Insights

  • Placements
    • Placement reports show a list of websites where ads have been shown across the Display and YouTube networks, providing associated click, cost and conversion data.
    • High performing ad placements can therefore help identify potential outreach and link building opportunities, where you have real world data to support these prospects.
  • DSAs
    • Dynamic Search Ads use your website content to target your ads, generating headlines and landing pages based on on-page information (sound familiar?).
    • If Google picks a page for a particular query, then chances are it’s the best page on your site for it.
  • Devices
    • PPC device reports allow users to view data across mobile, desktop, tablet and TV at the campaign, ad group and keyword level.
    • This allows you to isolate device performance down to the exact search term, where poor performance can be indicative of inferior UX.
  • Audiences
    • Paid networks provide performance data on a combination of pre-defined audiences, such as demographics, life events, job titles, interests and more.
    • This data can help direct changes on site with the aim of tailoring your landing pages, targeting and outreach to the right people.
  • Auction Insights
    • Auction insights reports let you compare metrics such as impression share, overlap rate and outranking share with other advertisers who are participating in the same auctions that you are.
    • This can not only help identify new competitors, but with data segmented by device, it can help trigger UX improvements should metrics decline on a particular device.

Summary

There are plenty of other examples where paid data, and indeed other channels within online marketing, can be used to offer insight to the world of SEO, but hopefully this helps widen your horizons when looking for areas of optimisation.

If you have any other paid media examples that would benefit this topic, then please comment below.

Now go bug your PPC department – they’ll love to show off.

Screaming Frog SEO Spider Update – Version 17.0

We’re pleased to announce Screaming Frog SEO Spider version 17.0, codenamed internally as ‘Lionesses’.

Since releasing the URL Inspection API integration in the SEO Spider and the launch of version 5 of the Log File Analyser, we’ve been busy working on the next round of prioritised features and enhancements.

Here’s what’s new in our latest update.


1) Issues Tab

There’s a new ‘Issues’ right-hand tab, which details issues, warnings and opportunities discovered.

This tab has been introduced to help better direct users to potential problems and areas for improvement, based upon existing filters from the ‘Overview’ tab.

Screaming Frog SEO Spider Issues Tab

An in-app explanation of each issue and potential actions is provided in English, German, Spanish, French and Italian.

Each issue has a ‘type’ and an estimated ‘priority’ based upon the potential impact.

  • Issues are errors or problems that should ideally be fixed.
  • Opportunities are potential areas for optimisation and improvement.
  • Warnings are not necessarily an issue, but should be checked – and potentially fixed.

For example, an ‘Internal URL Blocked by Robots.txt’ will be classed as a ‘warning’, but with a ‘High’ priority, as it could potentially have a big impact if incorrectly disallowed.

For experienced users, the new Issues tab is a useful way to quickly identify top-level problems and dive straight into them as an alternative to the Overview tab. For users with less SEO expertise, it will help provide more direction and guidance on improving their website.

All Issues can be exported in bulk via ‘Bulk Export > Issues > All’. This will export each issue discovered (including their ‘inlinks’ variants for things like broken links) as a separate spreadsheet in a folder (as a CSV, Excel and Sheets).

Bulk Export All Issues

The new Issues tab also works with crawl comparison similar to the Overview tab, to allow users to identify where issues have changed and monitor progress over time.

Crawl comparison for Issues

The SEO Spider has always been built for experts, and we have been reluctant to ever tell our users how to do SEO, as SEO requires context. The new Issues tab does not replace SEO expertise, or a professional who has context on the business, strategy, objectives, resources, website and the nuances of prioritising what’s important.

The new Issues tab should provide helpful hints and support for an SEO who can make sense of the data and translate it into appropriately prioritised actions.


2) Links Tab

There’s a new ‘Links’ tab which helps better identify link-based issues, such as pages with a high crawl-depth, pages without any internal outlinks, pages using nofollow on internal links, or non-descriptive anchor text.

Screaming Frog SEO Spider Links Tab

Filters such as ‘high’ internal and external outlinks and non-descriptive anchor text can be customised under ‘Config > Spider > Preferences’ to user-preferred limits.

All data can be exported alongside source pages via the ‘Bulk Export > Links’ menu as well.


3) New Limits

Users are now able to control the number of URLs crawled by URL Path for improved crawl control and sampling of template types.

Under ‘Config > Spider > Limits’ there’s now a ‘Limit by URL Path’ configuration to enter a list of URL patterns and the maximum number of pages to crawl for each.

Limit by URL Path

In the example above a maximum of 1,000 product URLs will be crawled, which will be enough of a sample to make smarter decisions.

Users can also now ‘Limit URLs Per Crawl Depth’, which can help with better sampling in some scenarios.


4) ‘Multiple Properties’ Config For URL Inspection API

The URL Inspection API is limited to 2k queries per property a day by Google.

However, it’s possible to have multiple verified properties (subdomains or subfolders) for a website, where each individual property will have a 2k query limit.

Therefore, in the URL Inspection API configuration (‘Config > API Access > GSC > URL Inspection’) users can now select to use ‘multiple properties’ in a single crawl. The SEO Spider will automatically detect all relevant properties in the account, and use the most specific property to request data for the URL.

URL Inspection API Multiple Properties Config

This means it’s now possible to get far more than 2k URLs with URL Inspection API data in a single crawl, if there are multiple properties set up – without having to perform multiple crawls.

Please use responsibly. This feature wasn’t built to circumvent Google’s limits, or to motivate users to create many different properties just to get indexing data for every URL on a website. Google may adjust this limit (to domain level, etc.) if it’s abused.
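
For anyone curious what a raw query to the URL Inspection API looks like, here’s a rough sketch against the REST endpoint. The ‘ACCESS_TOKEN’ is a placeholder; in practice you’d authenticate via a Google client library with Search Console scopes:

```python
import requests

ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

# "ACCESS_TOKEN" is a placeholder for an OAuth 2.0 token with
# Search Console scope; it is not a real credential.
response = requests.post(
    ENDPOINT,
    headers={"Authorization": "Bearer ACCESS_TOKEN"},
    json={
        "inspectionUrl": "https://www.example.com/some-page/",
        # The most specific verified property covering the URL.
        "siteUrl": "https://www.example.com/",
    },
)

result = response.json()["inspectionResult"]["indexStatusResult"]
print(result["coverageState"])  # e.g. "Submitted and indexed"
```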


5) Apple Silicon Version & RPM for Fedora

There’s now a native Apple Silicon version available for users on M1/M2 Macs, and an RPM for Fedora Linux users.

Apple Silicon Version

In limited internal testing of the native Silicon version we found that:

  • Crawling a locally hosted site with very little latency resulted in the crawl running twice as quickly.
  • Loading in a saved crawl was 4 times faster.

The native experience is just so much smoother overall in comparison to using the Rosetta 2 emulation layer.


6) Detachable Tabs

All tabs are now detachable. Users can right-click and ‘detach’ any tab from the main UI and move to a preferred position (across multiple screens).


This is pretty cool for users that like to keep one eye on a crawl while doing other things, or analysing all the things at once.

There are options to ‘re-attach’, ‘pin’ – and reset all tabs back to normal when it’s all too much, too.


Other Updates

Version 17.0 also includes a number of smaller updates and bug fixes.

  • The ‘Response Codes’ tab now has an additional ‘Internal’ and ‘External’ filter, which is reflected in the Overview tab and Crawl Overview Report. Thanks to Aleyda for the nudge on that one!
  • Tabs can be re-ordered more efficiently by dragging, or via a new ‘configure tabs’ menu available via a right-click on a tab or the tab down arrow.
  • The main scheduling UI has been updated with details of task name, next run, interval and has improved validation for potential issues that might stop it launching.
  • There’s a new ‘Save & Open’ option for exports, which works for all formats.
  • GA & GSC configs now have date pre-sets (last week, last month etc) which can be saved as part of the configuration and supplied in scheduled crawls.
  • The PSI integration now has Interaction to Next Paint (INP) and Time to First Byte (TTFB) metrics available.
  • The ‘Ignore Non-Indexable URLs for On-Page Filters’ configuration has been updated to ‘Ignore Non-Indexable URLs for Issues‘ and will by default, not flag non-indexable URLs for checks related to issues across a wider range of tabs and filters as detailed in-app and in the user guide.
  • The hreflang tag limit has been increased from 200 to 500, for websites that have lost the plot with lots of alternative localised URLs.
  • All new tabs, filters and issues, warnings and opportunities counts are available in the ‘Export for Data Studio’ and Data Studio Crawl Report via scheduling.
  • The SEO Spider has been updated to Java 17.
  • Fixed lots of small bugs that nobody wants to read about, particularly when everyone is on holiday and nobody is reading this.

We hope the new update is useful.

A big thank you goes out to our beta testers who have helped with the different language versions of our new in-app issue descriptions. In particular, MJ Cachón for translation help, a new Spanish Google Data Studio crawl template – and generally being awesome.

As always, thanks to the SEO community for your continued support, feedback and feature requests. Please let us know if you experience any issues with version 17.0 of the SEO Spider via our support.


Small Update – Version 17.1 Released 23rd August 2022

We have just released a small update to version 17.1 of the SEO Spider. This release is mainly bug fixes and small improvements –

  • Fix issue preventing scheduling working on Ubuntu.
  • Fix issue preventing JavaScript crawling working on Ubuntu.
  • Fix cookies issues around forms based authentication.
  • Fix JavaScript crawling failing to render some pages.
  • Fix crash when changing filters in the Response Codes tab.
  • Fix crash crawling URLs with HTTP in the domain name.
  • Fix crash selecting ‘Save & Open’.

Small Update – Version 17.2 Released 12th September 2022

We have just released a small update to version 17.2 of the SEO Spider. This release is mainly bug fixes and small improvements –

  • Reinstate auto scrolling behaviour at the bottom of table view, so users can see URLs appear as they are discovered. Thanks to Pete Mindenhall, Steve Morgan and a few others that spotted it.
  • Add handling for webp files in bad content filter.
  • Fix regression in cookie handling not showing all cookies.
  • Fix issue showing incorrect values for Total Internal/External URLs when ignoring URLs blocked by robots.txt.
  • Fix issue with last mode used not being persisted after restarts.
  • Fix issue with new Issues tab graphs not updating during a crawl.
  • Fix issue with new task dialog showing single digits for time.
  • Fix issue with selected filter being lost when returning to a tab.
  • Fix issue with old Response Codes 5XX export tab not working on upgrade.
  • Fix issue with Spelling & Grammar crashing when used on Apple Silicon.
  • Fix issue with Response Codes 4XX filtering showing 2XX responses after re-spider & reload.
  • Fix issue with garbled fonts affecting users with their own version of the Roboto font installed.
  • Fix crash in view source panel.
  • Fix crash crawling robots.txt with long Sitemap: line.
  • Fix crash around detachable tabs and full screen on macOS.
  • Fix crash showing ‘Visualisations > Force-Directed Crawl Diagram’.
  • Fix crash removing/re-spidering URL.
  • Fix crash viewing scheduling history.


Reactive PR: How Preparing for the Journey Will Help You Reach Your Destination

If you’re heading off on holiday this summer, you’ve likely followed all the necessary steps to ensure it will be one to remember, such as researching and booking the best place to stay, the best modes of transport and things to do while you’re there.

But while you might have made a plan, nothing beats some spontaneity – after all, it’s these spur-of-the-moment decisions that can make your break from reality even better than expected.

Does that sound like something you can relate to as a PR professional?

While it might be hard to compare the daily grind to planning a dream vacation, it’s time we discussed how we can channel some of that super organised yet relaxed holiday planning energy into our PR planning, and reactive PR in particular.

Think of your month-to-month PR plan, such as scheduling content for your client, as your holiday plan. This contains the things you’ve booked but can’t afford to take risks on, such as press releases or key topics your client wants to cover. Then, think of reactive PR as the spontaneous moments of your trip that are good at filling the downtime, come about randomly and enrich your campaign.

You’ll probably agree that being spontaneous all the time can be exhausting, especially if you’ve already got a jam-packed schedule. To sustain a reactive element within our client campaigns, we need to prepare for it and stay familiar with the upcoming events and topics we can react to.

You might find that preparing for the reactive journey will make reaching the destination even more worthwhile.


Doing Your Research

Before booking your holiday, you might like to familiarise yourself with hotel reviews, check social media and look at events happening in your area. This can help you decide on where you go, when you go and what you do while you’re there.

As a PR professional, it’s a good idea to be aware of the general news flow, check for planned awareness days or review trending topics that your experts can jump on in advance. This helps you prepare for the potential pieces of content your team can produce.

But while you might use tools such as TripAdvisor, Google Reviews or local guides to find out more about where you’re planning on going, there are ways you can utilise online tools to discover what topics are trending in the news.

Look to social media sites like Twitter, Instagram or TikTok to see what people are talking about. This helps you keep up to date with the latest headlines and provides an opportunity to look at what your competitors are doing.

Just as finding things to do on your trip is a good idea, getting to grips with the kind of content journalists are looking for sets you up for a successful month. While jumping on these might mean diverging from your original plan, you’re much more likely to position your client’s expert opinion right where it needs to be.

While the nature of reactive PR is not to be planned down to the final detail, being aware of what’s going on in your client’s sector means you can react quickly to relevant stories as and when they come up.


Getting There

We’re all booked, we know where we’re going and the best things to do while we’re there. But it’s as much about the journey as it is the destination.

With reactive PR the goal is to get the content out there as quickly as possible. For a smooth ride, we need to have all the components in place, such as knowledge of the client’s sector, their target audience and an idea of which publications and journalists are most likely to publish your content. Our media lists are what pull this all together.

As reactive PR often means improvising new content, this may mean your existing media lists don’t quite cut it. Be sure to review them before blindly sending out content. Check the relevancy of your contacts and search for new journalists that have been writing about similar topics, using research tools to make sure it’s getting to the right place at the right time for maximum coverage.

Remember that how you pitch your idea to journalists is just as important as who you send it to. Rather than sending a blanket email, consider how you can tailor your pitch to suit different outlets or journalists. A personal, targeted approach is much more likely to be well received, and even if they don’t use it, your extra efforts can help you gain respect and future work.

Just as having a bad flight can cause you to leave a bad review or not use an airline again, your outreach efforts could sour the effects of reactive commentary for your client. A poor approach to outreach can also result in a journalist devaluing you as a contact.

Get it right, and you arrive at your destination with plenty of coverage, new valuable contacts and a happy client.


Living in the Moment

After taking time to put together some reactive commentary, the feeling of opening your computer and finding stacks of coverage can feel just as euphoric as stepping off that plane when you arrive on holiday.

Now it’s time to update our clients, bank content, and relax.

By completing most of the hard work early, researching trending topics and getting results in the infancy of your outreach, you’re much more likely to reap the benefits.

Before you pack your bags and finish your reactive campaign at the peak of its success, consider what else you can do with it. Are there any additional exclusive comments you can provide now the topic is trending? Or has your extensive research resulted in inspiration for the next piece?

We need open-mindedness at the peak of our reactive PR journey, as this could open up doors for future opportunities.


Making it Unforgettable

Going home after a trip can be upsetting, but there are ways that you can make those memories last a lifetime, such as looking at old pictures, keeping in contact with friends you’ve made or even rebooking it all over again.

Similarly, rather than aimlessly writing reactive comments that are disposable, we must look at ways that we can utilise our successful strategy again.

To do this, we can review which publications and journalists covered our content, the timeliness of outreach and other quantifiable factors that made it successful. By sitting back and looking at the strategy, you’re much more likely to be able to replicate it another time and reduce wasted time for you and your client.

Plus, keeping reactive pieces linked to awareness days or seasonal events on file is a clever and time-efficient idea for when they come back next year.

So, are you ready to implement a structured reactive PR strategy and watch the results roll in? Get packing.


Your Essential Reactive PR Checklist

  • Awareness of events – Awareness Days, Seasonal Events, Trending topics etc.
  • Research tools – Twitter, Instagram, TikTok, News sites, Competitor Analysis etc.
  • A strong, tailored media list of relevant publications and journalists
  • Targeted and adaptable pitch
  • Coverage tracker
  • An open mind and plenty of creativity
  • Plenty of time to pivot and be reactive during a campaign!

Screaming Frog & Brighton SEO Charity 5-a-side Football Tournament

On Thursday, Screaming Frog teamed up with BrightonSEO to organise a charity football tournament at Power League Shoreditch. Over the course of the day, 16 teams battled it out in scorching conditions to win for their chosen charities (and pride!).

The tournament came to a climactic conclusion with Distinctly winning the main trophy tournament, and The PHA Group winning the plate tournament.

In total, the tournament raised £2,770 for Distinctly’s charity of choice, Electric Umbrella.

Each team contributed £100 for their entry, while Novos and Blue Array also generously raised an additional £170 and £1,000 respectively.

Thanks to everyone for taking part and making it such a great day. We hope to make this an annual event, so there will definitely be some new contenders and rematches next time!

You can find a link to the online photo gallery here: https://adobe.ly/3Rv8RJ7

Event photos by Aaron James, Screaming Frog.

Screaming Frog Log File Analyser Update – Version 5.0

We’re pleased to announce the release of the Screaming Frog Log File Analyser 5.0, codenamed ‘by the Sea’.

If you’re not already familiar with the Log File Analyser tool, it allows you to upload your server log files, verify search engine bots, and get valuable insight into search bot behaviour when crawling your website.

We’ve been busy working on heavily requested features and improvements. Here’s what’s new.


1) Updated Bot Verification

Search bot verification for Googlebot and Bingbot has been updated to use their public IP lists, which were kindly provided by the search engines, rather than performing a reverse DNS lookup. This means you can verify search bots almost instantly – saving loads of time.

Log File Analyser Bot Verification

Other search bots, such as Baidu or Yandex, still go through reverse DNS verification, so if they’re not required, remove them from the default selection when setting up a new project to speed up the process further.
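
For context, IP-list verification can be done by checking a client IP against Google’s published Googlebot ranges. The sketch below is purely illustrative (it is not how the Log File Analyser works internally); the JSON URL and schema are as Google documents them at the time of writing:

```python
import ipaddress
import json
import urllib.request

# Google's published Googlebot IP ranges.
RANGES_URL = "https://developers.google.com/search/apis/ipranges/googlebot.json"

def is_verified_googlebot(ip: str) -> bool:
    with urllib.request.urlopen(RANGES_URL) as resp:
        prefixes = json.load(resp)["prefixes"]
    addr = ipaddress.ip_address(ip)
    return any(
        addr in ipaddress.ip_network(p.get("ipv4Prefix") or p.get("ipv6Prefix"))
        for p in prefixes
    )

print(is_verified_googlebot("66.249.66.1"))  # an address within Googlebot's ranges
```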


2) Import Custom Log Formats

The Log File Analyser has been upgraded to support a wider variety of log file formats automatically, and now provides the ability to view and customise fields to be used within the log file. While the LFA can automatically parse virtually all log formats, this is useful when log files are extremely customised, or a required field is missing.

You’re able to preview what’s in the log file and which log file components have been selected against which fields in the tool, and adjust if necessary. To enable this feature, select the ‘Show log fields configuration window on each import’ option under the ‘New Project’ configuration.

Import Custom Log Formats

You can view what log file components are used for each field, customise or debug. This is an advanced feature, and in general is intended for more complex use cases.
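
To illustrate the kind of field mapping involved, here’s a sketch that parses a single Apache ‘combined’ format line with a regex. It’s purely illustrative of the components involved (IP, timestamp, request, status and so on), not the LFA’s internal parser:

```python
import re

# A typical Apache 'combined' access log line (made up for illustration).
LINE = ('66.249.66.1 - - [23/May/2022:10:36:16 +0000] '
        '"GET /blog/ HTTP/1.1" 200 5123 "-" '
        '"Googlebot/2.1 (+http://www.google.com/bot.html)"')

COMBINED = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) (?P<protocol>[^"]*)" '
    r'(?P<status>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

fields = COMBINED.match(LINE).groupdict()
print(fields["path"], fields["status"], fields["user_agent"])
```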


3) Import CSVs & TSVs

It can be a struggle to get hold of log files, and when you finally do, the log file data can be provided in random formats that are not raw access logs, such as CSVs or TSVs. Rather than forcing you to try and get raw logs, you can now just upload these file types directly into the Log File Analyser.

Log File Analyser CSV & TSV Importing

Just drag and drop them into the interface in the same way you import log files and the LFA will automatically detect the log components, and upload the events for analysis.


4) Dark Mode

If analysing log files wasn’t cool enough alone (and you hadn’t already realised from all the screenshots above), you can now switch to dark mode in the Log File Analyser.

Log File Analyser Dark Mode

To switch to dark mode, just hit ‘Config > User Interface > Theme’ in the top menu.

Adjust to Dark Mode

5) URL Events over Time

The ‘URLs’ tab now has a lower window ‘Chart’ tab, which shows events over time in a graph when you click URLs.

Log File Analyser Chart Tab

This makes it easier to visualise crawling activity and trends for a page than sifting through the raw data in the ‘Events’ tab for each URL.


6) Exclude

Similar to the already released Include configuration, you’re now able to provide a list of regexes for URLs to exclude from import, to further help focus on the areas you’re interested in analysing.

Log File Analyser Exclude

7) Countries Tab

There’s a new ‘Countries’ tab, which shows data and a map of activity based upon the IPs from a log file.

Log File Analyser Countries Tab

This can be used in combination with the user-agent global filter to monitor whether a search engine is crawling from one specific location for example.


8) Apple Silicon & RPM for Fedora

We’ve introduced a native Apple Silicon version, for those using shiny M1 Macs, and an RPM for Fedora Linux users.

In limited internal testing, we found that the native Silicon version was up to twice as fast importing log files than the emulation through Rosetta.

These will be introduced for future versions of the SEO Spider as well.


Other Updates

Version 5.0 also includes a number of smaller updates, security and bug fixes, outlined below.

  • The LFA has been updated to Java 17.
  • JSON support has been significantly improved. As well as logs with one JSON blob per line, the LFA can handle a single JSON blob with an embedded array of log events, and can parse files where a single JSON field or a CSV/TSV field is itself a whole embedded log line, i.e. where an Apache log line is embedded as a single JSON value in a JSON log.

We hope the new update is useful.

If you’re looking for inspiration for log file analysis, then check out our guide on 22 ways to analyse log files for SEO.

As always, thanks to the SEO community for your continued support, feedback and feature requests. Please let us know if you experience any issues with version 5.0 of the Log File Analyser via our support.

Small Update – Version 5.1 Released 20th June 2022

We have just released a small update to version 5.1 of the Log File Analyser. This release is mainly bug fixes and small improvements –

  • Add support for CSV files containing JSON values.
  • Add restart button to Workspace configuration dialog.
  • Fix regression importing request lines that do not contain the HTTP version.
  • Fix issue with restart sometimes not working on Windows.
  • Fix crash showing advanced import dialog.
  • Fix crash exporting tab with no results.
  • Fix crash importing binary file.
  • Fix crash importing project from older versions.

Small Update – Version 5.2 Released 7th September 2022

We have just released a small update to version 5.2 of the Log File Analyser. This release is mainly bug fixes and small improvements –

  • Add support for importing TSV files with embedded JSON.
  • Add support for some new timestamp formats.
  • Fix issue preventing log files with uppercase protocol prefixes being imported.
  • Fix crash importing log with unsupported timestamp.
  • Fix crash analysing logs.
  • Fix double quotes in the ‘unable to import log file’ dialog.
  • Fix crash displaying context menu.

How to Use Keyword Mapping to Future-Proof Your Site Structure

Remember road trips in the 90s when your dad would pull out a wrinkled old map to get from point A to point B?

Today, we have maps on our phones, so we always have a sense of direction.

Just as road maps help us navigate where we want to go, well-designed sitemaps help search engines crawl your website easily.

But how do you design a sitemap and build a site structure that would be a joy for search engines to crawl?

Keyword mapping!

Not only does keyword mapping help you implement an SEO-friendly site structure, but it also guides your content strategy in many ways:

  • It enables you to build out and scale content verticals.
  • It helps you build a site structure that provides an exceptional user experience and intuitive navigation for your readers.
  • It makes it easier for your marketing team to find growth opportunities, such as internal and external linking, content expansion, and conversion rate optimisation.

But before we jump to the ABCs of keyword mapping, let’s take a closer look at why keyword maps and site structures are essential for SEO.

This article is a guest contribution from Adriana Stein of AS Marketing.


Keyword Maps and Site Structures – Why Do They Matter?

Every home is different. Some are organised, while others are messy. The same goes for websites. Some have more “junk drawers” than others, making them harder to understand and less helpful to readers.

A sitemap is a remedy for chaos. Sitemaps help search engines find information on your website.

Now, there are two different types of sitemaps: HTML and XML.

  • XML sitemaps are manually submitted sitemaps that are primarily used for crawling and indexing your site.
  • HTML sitemaps mainly help users find and use the information on your site.

Nowadays, XML is the main focus, as internal linking helps users find content on your site naturally.
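
For reference, a minimal XML sitemap is little more than a list of <loc> entries. Here’s a sketch that generates one with the standard library; the URLs are placeholders:

```python
from xml.sax.saxutils import escape

# Placeholder URLs purely for illustration.
urls = ["https://www.example.com/", "https://www.example.com/products/"]

entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```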

Ultimately, a strong and orderly site structure gives your readers an enjoyable experience. And as a bonus, a clear structure can also earn you SERP site links – extra links that appear underneath your main result on search engines, suggesting pages that might interest your audience and allowing you to take up more real estate on the SERPs.

To recap, sitemaps help search engine bots crawl your website efficiently and optimise your crawl budget (the number of pages a search crawler can go through in a given time). And the way to ensure that your site is well-organised is by planning your content with keyword maps.

Let’s now go over how you can use keyword mapping to enhance your site structure.


How to Use Keyword Mapping to Enhance Your Site Structure and Overall SEO

My simple 5-step approach brings your site structure from disorganised and chaotic to neat and intentional. And by intentional, I mean it actually converts readers into customers.

Step 1: Audit Your Site and Review Your Current Sitemap

Let’s start from the top. To know where you’re headed, you must first perform a site audit (Screaming Frog is my preferred tool for this, as its crawling capabilities are truly unmatched). Next, review your site and see where most of the issues lie.

While carrying out this audit, ask yourself these questions:

  • Are your users able to easily find what they’re looking for?
  • Do they get stuck in a particular area? (Hint: If you’re unsure, I suggest installing a heatmap tool like Hotjar)
  • What do your bounce rate and time on site look like? (Hint: check Google Analytics for this data)
  • Is your content organised in a manner that’s easy to sift through and go from ToFu (Top of Funnel) pages to BoFu (Bottom of Funnel) pages?

Once you’ve completed your audit, it’s time to conduct keyword research!

Step 2: Conduct Keyword Research

Keyword research is the first step to keyword mapping. But for your focus keywords to be relevant, you must consider search intent.

Search intent is the thought process behind why someone is searching for a particular keyword or phrase. Understanding it is so important to how Google works that it has been a crucial part of recent algorithm updates. The biggest benefit of looking at search intent is that it keeps you focused on BoFu searches: keywords that describe the exact product or service you’re selling.

Search Intent Mismatch

Let’s break it down: Users search specific keywords and phrases for multiple reasons – like learning more about a topic, taking action (download, buy, shop, etc.), visiting a store or event in person, or simply finding a company’s contact details.

As Google and other search engines have become more refined, they’ve integrated parts of language like semantics (the meaning of a word, phrase, or text) into their algorithms. Because of this, it’s critical to keep an eye out for high-demand terms on your current pages that aren’t associated with the intention of those pages; these are mismatched search intents.

With users’ search intent in mind, the pages where mismatches typically occur are:

  • Homepages
  • Highest-ranking pages
  • Category pages

An Example of How to Align Your SEO Campaign With Search Intent

Let’s look at some examples so you can better understand this concept.

Bruce types “best places to stay in Boracay, Philippines” into Google.

His search intent is to look for holiday stays, not long-term rentals.

If you’re a company running an apartment complex for long-term rent in Boracay, you shouldn’t try to rank for “best places to stay in Boracay, Philippines”, because even if Bruce sees your website at the top of his search results, he won’t find your offer useful, and he’ll just click back to Google, hurting your SEO in the process.

A better approach would be to match your target audience’s search intent with keywords like “long term rentals in Boracay, Philippines” or “apartments in Boracay”.

As you can see, if you don’t consider search intent when doing keyword research, the chances are slim that you’ll find qualified leads and convert customers.

The bottom line: Satisfying search intent is Google’s primary goal – and should be yours, too.

Step 3: Create a Keyword Map

Okay, you’ve audited your site and conducted keyword research. It’s time to create a keyword map. With your keyword list, start grouping or “clustering” keywords according to search intent and possible content verticals.

Here are some examples of content verticals:

  • Product pages
  • Industry pages
  • Blog content
  • Case studies

For example, here are keyword clusters grouped by content vertical for a company that sells a project management app:

Product Pages
  • Project management app
  • Best project management app
  • Team project management apps
  • CRM and project management app
  • Project management tool

Industry Pages
  • Construction project management software
  • Healthcare project management
  • Marketing project management
  • Engineering project management

Blog Content/Resource Pages
  • Project management skills
  • What does a project manager do
  • Project management phases
  • Project management process

Organising your keyword list into clusters is the key factor for setting up your website structure in a manner that aligns your content with the different levels of search intent your audience faces as they go through the sales funnel.

Then once you have keyword clusters, it’s time to build your keyword map in a spreadsheet or Google sheet. Here’s an example of such a keyword map:

A well-organised keyword map includes elements like the following (see the sketch after this list):

  • The first three columns organise your pages into top and sub-pages – this is basically your site architecture (site structure).
  • The funnel column labels your content as ToFu, MoFu, and BoFu. This helps keep you mindful of satisfying your reader’s search intent.
  • The columns for the focus and related keywords are where you drop your keyword clusters. Organising them like this helps you develop and write your content according to on-page SEO best practices.
  • The URL column helps you find everything quickly and is particularly useful for internal linking purposes.
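
As a sketch, the same structure translates naturally into a dataframe, which makes filtering and auditing easy. Every value below is a hypothetical example rather than a recommendation:

```python
import pandas as pd

# Hypothetical rows mirroring the columns described above.
keyword_map = pd.DataFrame([
    {"Top page": "Products", "Sub-page": "Project management app",
     "Funnel": "BoFu", "Focus keyword": "project management app",
     "Related keywords": "best project management app, project management tool",
     "URL": "/products/project-management-app/"},
    {"Top page": "Blog", "Sub-page": "Project management phases",
     "Funnel": "ToFu", "Focus keyword": "project management phases",
     "Related keywords": "project management process",
     "URL": "/blog/project-management-phases/"},
])

# e.g. pull every BoFu URL to prioritise for internal linking.
print(keyword_map.loc[keyword_map["Funnel"] == "BoFu", "URL"].tolist())
```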

Alright, now that everything is organised, let’s tend to your site structure.

Step 4: Revise Site Structure

Based on the keyword map you just created, you can now adjust your site structure in a way that satisfies search engine crawlers and your users.

For example, if you used to have BoFu sales pages or MoFu case studies published as blog posts, you now know better to organise the sales pages under a “Products” content vertical and the case studies under a “Case Studies” content vertical.

As you update your page URLs and directories, make sure to use redirects so your users don’t end up stranded on broken links. As you continue to create new content, keeping your topic ideas organised in a keyword map helps you scale your content and your website in an organised way.

Step 5: Revisit and Update Keyword Map Quarterly

Now that you’ve done the initial legwork, all that’s left to do is refresh your keyword map regularly as you execute your content strategy.

Keep optimising and expanding your website to serve your audience with relevant content. As you create new content, check the keywords you already have in your keyword map. When you have a second look, you’ll often find that you need to make small tweaks to your content strategy.

Keyword maps can become complex, especially for big websites that tackle multiple topics, so review your keyword map regularly and keep it updated as you discover new topics and ideas to write about.

Here’s what I recommend to make the most of your keyword map:

  • Schedule an SEO audit update and content plan optimisation every 3-6 months.
  • Summarise the strengths, weaknesses, opportunities, and threats that come up during your audit.
  • Write down your proposed next steps and estimated budget.
  • Review keywords for each piece of content before writing it.
  • Track your content’s performance and make tweaks to the plan as needed.

Other Ways Keyword Mapping Enhances SEO

Keyword mapping is essential for providing your users with a great experience, but it also helps your SEO strategy in multiple ways:

  • It helps you develop and maintain a comprehensive internal linking strategy
  • It informs your on-page and off-page SEO content strategy
  • It helps you avoid duplicate content
  • It makes user search intent clearer
  • It ensures you’re using keywords as strategically and effectively as possible across all content

Keyword Mapping Is an Essential SEO Tool

Phew, that was jam-packed with information, so let’s recap, shall we?

Without a keyword map that defines your site structure, the user is confused, you’re confused, and search engines are confused.

But with a keyword map, it’s easier to target the right people, find growth opportunities, and scale your content production efficiently so that it brings in the most high-quality organic traffic possible. By basing your site structure on a well-designed keyword map, you’re truly establishing a future-proof website.

Screaming Frog Scores a Hat-Trick at the UK Digital PR Awards https://www.screamingfrog.co.uk/screaming-frog-awards-hat-trick/ https://www.screamingfrog.co.uk/screaming-frog-awards-hat-trick/#comments Wed, 06 Apr 2022 11:45:22 +0000 https://www.screamingfrog.co.uk/?p=193658 Last night, members of Screaming Frog’s Digital PR and Content Marketing team attended the inaugural UK Digital PR Awards ceremony at London’s Marble Arch, where our campaigns were nominated in eight categories. Hosted by Don’t Panic Events, the live awards ceremony was an intimate affair, bringing together the best in...

The post Screaming Frog Scores a Hat-Trick at the UK Digital PR Awards appeared first on Screaming Frog.


Last night, members of Screaming Frog’s Digital PR and Content Marketing team attended the inaugural UK Digital PR Awards ceremony at London’s Marble Arch, where our campaigns were nominated in eight categories.

Hosted by Don’t Panic Events, the live awards ceremony was an intimate affair, bringing together the best in the business to celebrate some incredible digital PR campaigns from the past year.

After cocktails at the nearby Hard Rock Café and a tasty three-course meal at the Montcalm Marble Arch venue, the awards ceremony kicked off.

First up, Screaming Frog scooped the award for “DIGITAL PR CAMPAIGN OF THE YEAR – B2B” for its work with healthcare charity Nuffield Health, focused on workplace wellbeing.

Then, we won first prize for “DIGITAL PR LOW BUDGET CAMPAIGN OF THE YEAR” for our PR and content marketing campaign with online health and wellbeing retailer StressNoMore.

Finally, Screaming Frog was recognised for its integrated approach to Digital PR and Content Marketing for B2B client Rouge Media, winning the award for “BEST USE OF CONTENT IN A DIGITAL PR CAMPAIGN”.

On the night we also won a Silver award for the “DIGITAL PR CAMPAIGN OF THE YEAR – E-COMMERCE” category for our campaign with hair extension company Milk + Blush. Congratulations to Propellernet for winning Gold for their campaign with Pour Moi!

It was a fantastic evening and a great chance to meet new faces, and catch up with friends over a glass of wine…or three.

Event photo by London Photographer, Simon Callaghan Photography.

Version 100 of Chrome, Firefox & Edge May Break Websites https://www.screamingfrog.co.uk/version-100-of-chrome-firefox-edge/ https://www.screamingfrog.co.uk/version-100-of-chrome-firefox-edge/#comments Mon, 07 Mar 2022 09:32:12 +0000 https://www.screamingfrog.co.uk/?p=189418 Google Chrome, Mozilla Firefox and Microsoft Edge are rapidly closing in on a big milestone: version 100. While this sounds like a cause for celebration, it could result in endless headaches for a small number of websites, due to the bugs and compatibility issues that come with a triple-digit user...

The post Version 100 of Chrome, Firefox & Edge May Break Websites appeared first on Screaming Frog.

Google Chrome, Mozilla Firefox and Microsoft Edge are rapidly closing in on a big milestone: version 100. While this sounds like a cause for celebration, it could result in endless headaches for a small number of websites, due to the bugs and compatibility issues that come with a triple-digit user agent string.

Read on to find out why people are anticipating something akin to the Y2K bug all over again.


What Is a User-Agent?

To briefly summarise, a browser’s user-agent is a string (line of text) that identifies which browser is being used, what version it is, and which operating system it’s running on. It’s sent by the browser to the server in an HTTP header, and can be used for things like serving a mobile version of a site if the request comes from a smartphone.

To give you an example, the current user-agents for Chrome, Firefox and Edge are:

  • Chrome: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36
  • Firefox: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:96.0) Gecko/20100101 Firefox/96.0
  • Edge: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36 Edg/98.0.1108.62

Why Might Version 100 of Chrome, Firefox and Edge Break Websites?

The major concern with version 100 is that the version number moves from two digits to three. The web uses various ways of handling and parsing user-agent strings, and similar issues were seen when browser versions moved from single digits to double digits a little over 12 years ago.

In Mozilla’s own words:

“Without a single specification to follow, different browsers have different formats for the User-Agent string, and site-specific User-Agent parsing. It’s possible that some parsing libraries may have hard-coded assumptions or bugs that don’t take into account three-digit major version numbers. Many libraries improved the parsing logic when browsers moved to two-digit version numbers, so hitting the three-digit milestone is expected to cause fewer problems. Mike Taylor, an engineer on the Chrome team, has done a survey of common UA parsing libraries which didn’t uncover any issues. Running Chrome experiments in the field has surfaced some issues, which are being worked on.”

Both Mozilla and Google have been testing this for some time, and there are backup strategies in place should the level of disruption be higher than anticipated. Both have stated that if there are widespread issues, they can temporarily freeze the major version at 99 to avoid further problems.
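To illustrate the kind of hard-coded assumption Mozilla describes, here’s a deliberately naive sketch in Python. It isn’t taken from any real parsing library; it simply shows how a parser that expects a two-digit major version misreads Chrome 100 as Chrome 10.

import re

# Deliberately naive parsing that assumes a two-digit major version.
# This is a hypothetical example, not code from a real UA parsing library.
naive_major = re.compile(r"Chrome/(\d{2})")

ua_v99 = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
          "Chrome/99.0.4844.51 Safari/537.36")
ua_v100 = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
           "Chrome/100.0.4650.4 Safari/537.36")

for ua in (ua_v99, ua_v100):
    major = int(naive_major.search(ua).group(1))
    # Chrome 100 is parsed as major version 10 and treated as ancient.
    verdict = "blocked as outdated" if major < 40 else "served normally"
    print(major, "->", verdict)

A pattern like r"Chrome/(\d+)" copes with any number of digits, which is the sort of fix many libraries made when versions first went from one digit to two.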


When Will Version 100 of Chrome, Firefox and Edge Be Released?

Version 100 of each browser is set to be released on the following dates:

  • Chrome: March 29, 2022
  • Firefox: May 3, 2022
  • Edge: week of March 31, 2022

They do, however, go through experimental and beta versions before being rolled out on the stable channel, so these dates may change.


What Websites Are Impacted?

It’s anticipated that only a very small number of websites are impacted, but the list of broken websites being tracked on GitHub is growing each day. Notable names include Mirror.co.uk, Screwfix.com and Bethesda.net.

Out of interest, we bulk crawled over 3,000 domains from the Majestic Million, and didn’t find any other instances of broken websites.

The sites that aren’t compatible returned ‘403 Forbidden’ response codes, which would result in users receiving an error page when using version 100 of Firefox, Chrome or Edge. However, some websites may instead redirect users to an ‘unsupported browser’ page, so be sure to check redirecting URLs and their destinations too.

For example, Standard Chartered’s Indian site is currently redirecting to an unsupported browser page advising users to upgrade their browser.

Further to this, Googlebot regularly updates its user-agent string to match the latest stable version of Chrome. This means that if version 100 causes your site to return a 403 Forbidden response code once Googlebot updates, you could experience crawling and indexing issues.

While it is a small number of websites in the grand scheme of things, you definitely don’t want to be one of them.


How to Check if Your Website Is Impacted

If you want to check websites at scale, for example a list of your clients, the Screaming Frog SEO Spider can check them in seconds.

To do so you’ll first want to set a custom user-agent by going to Configuration > User-Agent > Custom (dropdown). The user-agents you can use to simulate version 100 are:

Firefox:
Mozilla/5.0 (Windows NT 10.0; rv:100.0) Gecko/20100101 Firefox/100.0

Chrome:
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/100.0.4650.4 Safari/537.36

Edge:
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36 Edg/100.0

Once the user-agent has been set, switch to list mode (Mode > List) and upload sites in bulk from a file or by pasting them in.

You’ll want to try all the user-agents to be sure, as at the moment some websites may load fine with Chrome v100, but not Firefox v100 and vice versa.

A domain returning a 403 Forbidden response code is a cause for concern, as it could mean it’s not compatible with version 100 of Firefox, Chrome and Edge. Double check this using the browsers themselves (see the next section for how), as it could just be some kind of protection blocking the Spider Tool, such as Cloudflare.

As mentioned previously, be sure to thoroughly check any domains that are 30X redirecting, as some websites will redirect users to an unsupported or out-of-date browser warning page. To follow redirects in list mode, use the ‘Always Follow Redirects’ option under ‘Config > Spider > Advanced’.
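If you’d prefer to script the check rather than crawl, the same idea can be sketched in a few lines of Python using the requests library. The domains below are placeholders, and, as with the SEO Spider, treat any 403 or redirect this surfaces as a prompt for manual investigation rather than a verdict.

import requests

# Simulated version 100 user-agent for Chrome (from the list above).
UA_CHROME_100 = (
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/100.0.4650.4 Safari/537.36"
)

# Placeholder list; in practice, load your domains from a file.
sites = ["https://www.example.com/", "https://www.example.org/"]

for url in sites:
    try:
        # Don't follow redirects automatically, so we can inspect them:
        # some sites bounce v100 browsers to an 'unsupported browser' page.
        r = requests.get(url, headers={"User-Agent": UA_CHROME_100},
                         timeout=10, allow_redirects=False)
        print(url, r.status_code, r.headers.get("Location", ""))
    except requests.RequestException as exc:
        print(url, "request failed:", exc)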


Check a Website Using a Browser

If you just need to check a small number of websites, you don’t need to use the SEO Spider. Both Firefox and Chrome allow you to report the major version as 100:

For Chrome:
  • Go to chrome://flags/#force-major-version-to-100
  • Set the dropdown to ‘Enabled’

For Firefox:
  • Download and install Firefox Nightly
  • Open the settings, search for ‘Firefox 100’, then enable ‘Firefox 100 User-Agent String’


Summary

We highly recommend spending some time checking both your own and your clients’ websites, to ensure they won’t be impacted by the imminent arrival of version 100 of the major browsers. The aforementioned GitHub page has lots more information on the nitty-gritty behind this issue, and also allows you to flag websites that are experiencing problems.

Big shout out to John Mueller for getting this on our radar!

4 Things I Learnt at the Women in Tech SEO Festival https://www.screamingfrog.co.uk/wtsfest-2022/ https://www.screamingfrog.co.uk/wtsfest-2022/#comments Tue, 01 Mar 2022 11:45:06 +0000 https://www.screamingfrog.co.uk/?p=189016 The conference was held in the Barbican Centre, with the breaks being held in the pretty garden room/conservatory, and the talks being held in the cinema on the bottom floor. Was there the option of taking the lift up the 5 floors between the two rooms? Yes. Did we decide...

The post 4 Things I Learnt at the Women in Tech SEO Festival appeared first on Screaming Frog.

The conference was held in the Barbican Centre, with the breaks in the pretty garden room/conservatory and the talks in the cinema on the bottom floor. Was there the option of taking the lift up the 5 floors between the two rooms? Yes. Did we decide to use the lifts? No. We decided climbing the stairs was quicker than waiting, and good cardio!

With the promise of pastries and the fear of being too late to grab one, we made sure to get to the Barbican for just after 9. We signed in, grabbed all the free merch available, and had a quick look around the conservatory before the first talk. On the way we found a turtle sleeping by his pool. The rest of the day followed a similar pattern of an hour of talks with half an hour of breaks (with snacks at every opportunity).

Everyone we met at the conference was super friendly; being new to the industry, I don’t know many people yet. Of course, coming from a well-known company certainly helps as an icebreaker (and luckily no one asked me any tricky questions about the SEO Spider).


4 Things I Learnt

Analyse

Thanks to Lidia, Rejoice and Crystal for starting off the talks strong!

In the first section, I learnt about the possibility of repurposing old content and checking whether it can be rewritten to target different keywords. Thinking of new ideas every month is one of the trickiest parts of SEO, so I’ll be sure to check what we’ve previously done and see if I can repurpose any of it.

Advance

Two more great talks from Roxana and Aleyda. Unfortunately, Aleyda couldn’t be there in person, but it was a great presentation nonetheless.

In the advance section, I learnt about writing reports, just in time for our monthly reports! It made me think about my clients’ KPIs, how to write my reports to reflect them, and how to make sure I don’t include unnecessary or confusing data. (Roxana gave a great talk about the internet; however, it went completely over my head.)

Innovate

This was my favourite section of the day thanks to Paige, Lazarina and Miracle.

The overall theme of these talks was how to use machine learning to improve the speed and efficiency of certain SEO tasks, such as writing meta descriptions. Shout out to Miracle for her great talk on how to use ML to make websites more accessible. My main takeaway, however, was to start small, by searching for 10-minute ‘Machine Learning for SEO Beginners Guides’ on YouTube.

Empower

The final section of the day rounded off the conference well; Lou and Shannon were indeed empowering.

In this section, I learnt how to improve myself in the work environment: not only how to use my negatives as positives, but how to identify the things I consider negatives in the first place. It made me think about my inner person, and about making sure I allow myself to process and be honest about how I’m feeling.


Final Thoughts

Overall, I really enjoyed WTSFest. The order of the talks was well thought out, moving from simple concepts like how to write a competitor analysis to more complicated ones, such as how to use machine learning. It really inspired me to continue to grow.

I may be new to working in SEO; however, I have been to other conferences, and the quality of the talks was much better at the Women in Tech SEO Festival!

Mega thanks to Areej and the Women in Tech SEO community! Hope to see you at the next conference!
