Friday, May 25, 2018

The 12 most important elements of a technical SEO audit

Contrary to popular belief, technical SEO isn’t too challenging once you get the basics down; you may even be using a few of these tactics and not know it.

However, it is important to know that your site probably has some type of technical issue. “There are no perfect websites without any room for improvement,” Elena Terenteva of SEMrush explained. “Hundreds and even thousands of issues might appear on your website.”

For example, over 80% of websites examined had 4xx broken link errors, according to a 2017 SEMrush study, and more than 65% of sites had duplicate content.

Ultimately, you want your website to rank better, get better traffic, and net more conversions. Technical SEO is all about fixing errors to make that happen. Here are 12 technical SEO elements to check for maximum site optimization.

1. Identify crawl errors with a crawl report

One of the first things to do is run a crawl report for your site. A crawl report, or site audit, will provide insight into some of your site’s errors.

You will see your most pressing technical SEO issues, such as duplicate content, low page speed, or missing H1/H2 tags.

You can automate site audits using a variety of tools and work through the list of errors or warnings created by the crawl. This is a task you should work through on a monthly basis to keep your site clean of errors and as optimized as possible.
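
If you want a rough feel for what a crawler sees before reaching for a paid tool, a minimal sketch is easy to write. This Python example assumes the third-party requests and beautifulsoup4 libraries and a placeholder start URL, and it only checks status codes and missing H1s; a real audit tool checks far more.

```python
# A minimal crawl sketch using the third-party requests and
# beautifulsoup4 libraries (pip install requests beautifulsoup4).
# START_URL is a placeholder -- point it at your own site.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"
MAX_PAGES = 50  # keep the sketch polite and finite

seen, queue, issues = set(), [START_URL], []

while queue and len(seen) < MAX_PAGES:
    url = queue.pop(0)
    if url in seen:
        continue
    seen.add(url)
    resp = requests.get(url, timeout=10)
    if resp.status_code >= 400:
        issues.append((url, f"status {resp.status_code}"))
        continue
    soup = BeautifulSoup(resp.text, "html.parser")
    if soup.find("h1") is None:
        issues.append((url, "missing H1"))
    # Follow internal links only
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == urlparse(START_URL).netloc:
            queue.append(link)

for url, problem in issues:
    print(url, "->", problem)
```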

2. Check HTTPS status codes

Switching to HTTPS is a must: browsers now flag plain HTTP pages as not secure, and a botched migration can leave search engines and users hitting 4xx and 5xx status codes instead of your content.

A Ranking Factors Study conducted by SEMrush found that HTTPS now is a very strong ranking factor and can impact your site’s rankings.

Make sure you switch over, and when you do, use this checklist to ensure a seamless migration.

Next, you need to look for other status code errors. Your site crawl report gives you a list of URL errors, including 404 errors. You can also get a list from the Google Search Console, which includes a detailed breakdown of potential errors. Make sure your Google Search Console error list is always empty, and that you fix errors as soon as they arise.
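
You can spot-check a migration yourself by confirming that legacy HTTP URLs permanently redirect to their HTTPS counterparts. A minimal Python sketch, with placeholder URLs:

```python
# Sketch: confirm each legacy HTTP URL permanently redirects to HTTPS
# and that the final destination returns a 200. URLs are placeholders.
import requests

urls = ["http://www.example.com/", "http://www.example.com/about/"]

for url in urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    permanent = bool(resp.history) and resp.history[0].status_code in (301, 308)
    ok = permanent and resp.url.startswith("https://") and resp.status_code == 200
    print(f"{url} -> {resp.url} ({resp.status_code})", "OK" if ok else "CHECK")
```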

Finally, make sure the SSL certificate is correct. You can use SEMrush’s site audit tool to get a report.

3. Check XML sitemap status

The XML sitemap serves as a map of your website for Google and other search engine crawlers. It essentially helps the crawlers find your website's pages so they can be indexed and ranked accordingly.

You should ensure your site’s XML sitemap meets a few key guidelines:

  • Make sure your sitemap is a properly formatted XML document
  • Ensure it follows the XML sitemap protocol
  • Include all the updated pages of your site in the sitemap
  • Submit the sitemap to Google Search Console

How do you submit your XML Sitemap to Google?

You can submit your XML sitemap to Google via the Google Search Console Sitemaps tool. You can also declare its location anywhere in your robots.txt file with a Sitemap directive (e.g. Sitemap: https://www.example.com/sitemap.xml).

Make sure your XML Sitemap is pristine, with all the URLs returning 200 status codes and proper canonicals. You do not want to waste valuable crawl budget on duplicate or broken pages.
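
A quick way to spot-check this is to script it. The Python sketch below (placeholder sitemap URL; the namespace is the standard sitemap schema) pulls the sitemap, extracts each <loc> entry and flags anything that doesn't come straight back with a 200:

```python
# Sketch: fetch sitemap.xml, pull every <loc> URL and flag anything
# that does not return a 200. The sitemap URL is a placeholder.
import xml.etree.ElementTree as ET

import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    status = requests.head(loc.text, allow_redirects=False, timeout=10).status_code
    if status != 200:
        # A redirect or error here wastes crawl budget
        print(loc.text, "returned", status)
```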

4. Check site load time

Your site’s load time is another important technical SEO metric to check. According to the technical SEO error report via SEMrush, over 23% of sites have slow page load times.

Site speed is all about user experience and can affect other key metrics that search engines use for ranking, such as bounce rate and time on page.

To find your site’s load time you can use Google’s PageSpeed Insights tool. Simply enter your site URL and let Google do the rest.

You’ll even get site load time metrics for mobile.

This has become increasingly important since Google's rollout of mobile-first indexing. Ideally, your page load time should be less than three seconds. If it is more for either mobile or desktop, it is time to start tweaking elements of your site to decrease load time and improve rankings.
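
If you prefer to pull these numbers programmatically, Google exposes PageSpeed Insights as an API. A minimal sketch, assuming a placeholder site URL and the v5 response layout at the time of writing:

```python
# Sketch: query the PageSpeed Insights v5 API for a mobile performance
# score. The site URL is a placeholder, and the response field names
# reflect the API at the time of writing.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

data = requests.get(API, params=params, timeout=60).json()
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print("Mobile performance score:", round(score * 100))
```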

5. Ensure your site is mobile-friendly

Your site must be mobile-friendly to improve technical SEO and search engine rankings. This is a pretty easy SEO element to check using Google’s Mobile-Friendly Test: just enter your site and get valuable insights on the mobile state of your website.

You can even submit your results to Google to let them know how your site performs.

A few mobile-friendly solutions include:

  • Responsive design, where a single site adapts its layout to every screen size
  • Dynamic serving, where the same URL returns different HTML depending on the device
  • A separate mobile site on its own URLs (such as an m-dot subdomain)

6. Audit for keyword cannibalization

Keyword cannibalization can cause confusion among search engines. For example, if you have two pages in keyword competition, Google will need to decide which page is best.

“Consequently, each page has a lower CTR, diminished authority, and lower conversion rates than one consolidated page will have,” Aleh Barysevich of Search Engine Journal explained.

One of the most common keyword cannibalization pitfalls is optimizing the home page and a subpage for the same keywords, which happens often in local SEO. Use Google Search Console's Performance report to look for pages that are competing for the same keywords. Use the filter to see which pages have the same keywords in the URL, or search by keyword to see how many pages are ranking for those same keywords.

In this example, notice that there are many pages on the same site with the same exact keyword. It might be ideal to consolidate a few of these pages, where possible, to avoid keyword cannibalization.

7. Check your site’s robots.txt file

If you notice that some of your pages aren't being indexed, the first place to look is your robots.txt file.

Site owners will sometimes accidentally block pages from search engine crawling, which makes auditing your robots.txt file a must.

When examining your robots.txt file, look for "Disallow" rules.

A "Disallow: /" on its own tells search engines not to crawl your entire website, while more specific rules (for example, "Disallow: /private/") block individual directories or pages. Make sure none of your relevant pages are being accidentally disallowed in your robots.txt file.
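
Python's standard library includes a robots.txt parser, so auditing for accidental blocks takes only a few lines. A minimal sketch with placeholder URLs:

```python
# Sketch: use the standard-library robots.txt parser to confirm that
# important pages are not disallowed. URLs are placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

for page in ["https://www.example.com/", "https://www.example.com/products/"]:
    if not rp.can_fetch("*", page):
        print("Blocked for all crawlers:", page)
```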

8. Perform a Google site search

On the topic of search engine indexing, there is an easy way to check how well Google is indexing your website. In Google search, type "site:yourwebsite.com":

This shows you roughly which pages Google has indexed, which you can use as a reference. A word of caution, however: if your site is not at the top of the list, you may have a Google penalty on your hands, or you're blocking your site from being indexed.

9. Check for duplicate metadata

This technical SEO faux pas is very common for ecommerce sites and other large sites with hundreds or thousands of pages. In fact, nearly 54% of websites have duplicate meta descriptions, and approximately 63% are missing meta descriptions altogether.

Duplicate meta descriptions occur when similar products or pages simply have content copied and pasted into the meta description field.

A detailed SEO audit or a crawl report will alert you to meta description issues. It may take some time to get unique descriptions in place, but it is worth it.

10. Meta description length

While you are checking all your meta descriptions for duplicate content errors, you can also optimize them by ensuring they are the correct length. This is not a major ranking factor, but it is a technical SEO tactic that can improve your CTR in SERPs.

Google's recent change to snippet length increased the practical limit from around 160 characters to around 320. This gives you plenty of space to add keywords, product specs, location (for local SEO), and other key elements.
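
Both of these meta description checks are easy to script together. A minimal Python sketch (third-party requests and beautifulsoup4, placeholder URLs); note that the length window used here is a judgment call, not a Google rule:

```python
# Sketch: gather meta descriptions from a list of placeholder URLs,
# then flag missing ones, duplicates, and lengths outside a chosen
# window. The 70-320 window is a judgment call, not a Google rule.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = ["https://www.example.com/a", "https://www.example.com/b"]
by_description = defaultdict(list)

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    desc = (tag.get("content") or "").strip() if tag else ""
    by_description[desc].append(url)
    if not desc:
        print("Missing description:", url)
    elif not 70 <= len(desc) <= 320:
        print(f"Length {len(desc)}:", url)

for desc, pages in by_description.items():
    if desc and len(pages) > 1:
        print("Duplicate description on:", pages)
```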

11. Check for site-wide duplicate content

Duplicate content in meta descriptions is not the only duplicate content you need to be on the lookout for in technical SEO. Almost 66% of websites have duplicate content issues.

Copyscape is a great tool for finding duplicate content on the internet. You can also use Screaming Frog, Sitebulb or SEMrush to identify duplication.

Once you have your list, it is simply a matter of running through the pages and changing the content to avoid duplication.
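
For exact duplicates within your own site, even a crude fingerprint will surface offenders. A minimal Python sketch that hashes each page's normalized visible text; note it only catches identical pages, while dedicated tools use fuzzier matching to find near-duplicates too.

```python
# Sketch: fingerprint each page's visible text with a hash so exact
# duplicates collide. URLs are placeholders.
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = ["https://www.example.com/a", "https://www.example.com/b"]
by_hash = defaultdict(list)

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    text = " ".join(soup.get_text().split()).lower()  # normalize whitespace
    by_hash[hashlib.sha1(text.encode()).hexdigest()].append(url)

for pages in by_hash.values():
    if len(pages) > 1:
        print("Identical body text:", pages)
```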

12. Check for broken links

Any type of broken link is bad for your SEO; it can waste crawl budget, create a bad user experience, and lead to lower rankings. This makes identifying and fixing broken links on your website important.

One way in which to find broken links is to check your crawl report. This will give you a detailed view of each URL that has broken links.

You can also use a tool such as DrLinkCheck.com to look for broken links. You simply enter your site's URL and wait for the report to be generated.
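
A basic link checker is also only a few lines of Python. This sketch scans a single placeholder page; for a whole site, feed in every URL from your crawl report or sitemap.

```python
# Sketch: scan one placeholder page for links that error out.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

PAGE = "https://www.example.com/"
soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    link = urljoin(PAGE, a["href"])
    if not link.startswith("http"):
        continue  # skip mailto:, tel: and fragment links
    try:
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print("Broken:", link, status)
```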

Summary

There are a number of technical SEO elements you can check during your next SEO audit. From XML Sitemaps to duplicate content, being proactive about optimization on-page and off is a must.

Thursday, May 24, 2018

From being too broad to being too lazy: three common PPC fails 

PPC and search marketing are both vital to a company’s success. So, it’s amazing to see the mistakes that so many brands still make today. AdWords has added tools like upgraded URLs to make it a little easier to manage campaigns.

But glaring errors still happen – and frequently. While these mistakes can seem small – especially if a brand has a big SEM budget – each one can have a significant impact on an advertiser’s reputation and ROI. Here are a few of the most common PPC mistakes search marketers make, and some methods to address them.

Your search terms are too high-level

A common mistake for many first-time (and even experienced) search advertisers is starting out too broadly. For example, if you're an electrician in Boston using AdWords for the first time, you don't want to go in big on main terms such as [electrician] and rely on city geo-targeting. Instead, be selective about your target keywords and build campaigns around specific terms such as [Electrician Arlington] or [24 callout Brookline Electrician].

The same rule applies to other verticals, such as retail. It can be costly to start driving traffic on the term [dresses] if you're a retailer, whereas terms like [size 12 red dress] have a higher propensity to convert. Start with these terms, then add terms higher up the funnel to build awareness.

This process will instill more discipline into how you measure the individual ROI of your range of keywords and bring scale when running on AdWords.

If you’re about to make the leap into broad expensive generics, then why not just target these keywords with RLSA only to build retargeting lists. It’s a more conservative step than going full throttle to make an impact in that auction.

Lazy ad management

Lazy ad copy is a big no-no in paid search. And using the same copy for all sponsored listings should be banned. Tailored ad copy offers the best way to get clicks and conversions, boosting ROI and generally making a much bigger impact than if a brand used the same ad copy for every keyword they were targeting. 

Brands should always add context to their ad copy, and changing the wording for specific ads allows them to do that. For example, if a cruise line runs ads for an all-inclusive trip, it needs to add something tailored to the copy, like the customer perks, to make users want to click. For trips that aren't all-inclusive and are designed for families with children, the cruise line should change the copy to appeal to those looking for the best deals for kids or for entertainment.

Speaking of lazy ad management, we have seen ads where brands somehow forgot to change the auto prompt in their ad setup. This means they're not only targeting the string "add your keywords here," but they've also set the ad to autofill the headline based on the keywords. The result is a silly ad that's unlikely to get any clicks.

Now, this could be a simple oversight, since the prompt text sometimes fails to disappear when you start typing your keywords into the box. However, marketers need to check their targeted keywords on an ongoing basis to achieve truly successful PPC management.

In addition, as Google continues to push for more hands-off automation in the AdWords workflow — through features like Dynamic Search Ads — it’s important to keep a close eye on what’s going on. These suggestions may help you automate, but they might not be the best fits to meet campaign goals and could actually hurt your standing if they don’t take the actions of your competitors into account. So advertisers, stay vigilant.

Not mastering your seasonal spikes in demand

If you don’t know your seasonal trends inside out, then there’s a very good chance you’ll be left behind in auctions and miss the spike in demand.

AdWords and analytics let you get into the weeds on the time of day, or day of the week, that's driving impressions. You must be ready to react to these trends, while staying within your target margin for a good ROI.

For some big brands, plans around mastering Black Friday peak periods start around three months before the event. A great deal of planning goes into the price, product and type of promotion for this XX day period. As auctions become increasingly competitive, it’s vital that you have a strategy to win, too.

To get the most out of your seasonal spikes, you need to master all match types, segment your RLSA lists, increase bid modifiers by device and daypart where possible, and opt into DSA to fill any gaps you may have missed.

The common theme here is a lack of attention. PPC advertisers must always monitor campaigns to ensure they don't make the mistakes seen here. That means a more thoughtful strategy, with the right tools in place on the back-end – safeguards in AdWords, third-party technology that monitors for errors and mistakes, and beyond.

An introduction to innovation in consumer search optimization

It may be an obvious statement, but over the last 15 years the internet has completely transformed the retail industry. The once-flourishing high street is declining, as more and more consumers swap the shopping experience for the convenience and added choice that online retail offers. Mobile technology means more people are purchasing products via apps while on the go, rather than popping out on a Saturday morning to browse the aisles and rails in-store.

The digital retail space has seen a huge number of disruptive innovations over the years, from artificial intelligence (AI) offering tailored recommendations, to smart chatbots streamlining customer service, to the new additions of drone deliveries and augmented reality – it's an exciting time. Despite these leaps, most search technologies still used by modern retailers and brands are lagging behind. These search engines are the next element of the retail sphere set for an innovation makeover.

Search engines today

When it comes to current search engines there are some big issues with accuracy. A Spoon Guru scan of the food search landscape revealed that the majority of leading supermarkets around the globe – including Walmart in America, Sainsbury’s in the UK and Carrefour in France – fail to return accurate search results for common dietary search terms, such as gluten free, low sugar or vegan. These needs seem simple, so imagine the difficulty of finding the right food if you have more complex requirements or multiple preferences.

Similarly, Google – the biggest search engine on the planet – came up short on specific requirements. An analysis of the first page of results on a Google Shopping search for vegan sausages found that a staggering 19 of the 40 products were not vegan. In fact, two even contained pork.

So, what do you cook when you have a vegan, a coeliac and a nut allergy sufferer coming over for dinner? It isn't the start of a bad joke, as I'm sure you suspect, but a real-life scenario.

The grocery market (the industry in which Spoon Guru's technology currently sits) is of course an area where there should be no margin for error: for those with serious allergies and intolerances, the consequences of a mistake can be fatal. However, no matter what the search enquiry – from homeware to sports equipment and clothing – consumers these days deserve, and ultimately should be able to get, correct results that perfectly match their personal requirements.

The next generation of search technology

Despite a wealth of available products, people still find it challenging to locate what they need, owing to unstructured metadata and unconnected databases. Being innovation-led is crucial for retailers and brands who want to survive the digital boom, and responsive changes are required to match the shift in consumer expectations. Consumers want a personalized, seamless and consistent experience. So how do you optimize search capabilities to match this?

A combination of AI, machine learning and human expertise powers the next generation of search technology: Consumer Search Optimisation (CSO). The secret to the CSO system is tagging and natural language processing. Natural language is the biggest problem facing machine learning, as the ambiguities and variables within it present the system with imperfect data.

Spoon Guru’s TAGS technology breaks down this language and translates it into labels that can then be assigned and organized. Over 24 hours, the system analyses 14 billion data tags, 2.5 million statements and over 16 million words, classifying over 180 preferences within the grocery retail space. CSO literally crunches billions of data points every day, opening up the market to match more products than ever before to specific consumer requirements.

Another important part of CSO is the human-in-the-loop system. Incorporating expertise – in Spoon Guru’s case, nutritional expertise – with the algorithms means that any inconsistencies, conflicts or erroneous classifications can be resolved. It also means that the latest scientific knowledge continues to be integrated into the technology’s DNA.

Tesco have already adopted the TAGS technology on their digital shopping platforms – via desktop and the app – helping shoppers find more products to match dietary needs, from the simple to the very complex.

CSO not only provides a better service for users and consumers, but by appearing in specific online searches, it will help boost brands’ profiles by providing further visibility, as well as becoming a core revenue driver.

Future-gazing

Currently, TAGS technology and CSO are transforming the grocery industry, an area where specific requirements are becoming more of a necessity to the consumer – 64% of the world's population are now on some kind of exclusion diet. With Tesco, one of the UK's largest retailers, on board, we can expect the technology to become a set standard across the industry.

Eventually this technology can be expanded and modified to work across different sectors: entertainment, fashion, sports, hospitality, events, even pets. The possibilities of this transformative technology are pretty wide.

By leveraging smart technology (and smart people) we can cater for the modern multi-preference consumer, providing much more accuracy, relevance and choice.

Wednesday, May 23, 2018

When to just say “no” to bidding on brand

Brand bidding is a hotly debated topic in paid media. That said, a cursory glance reveals that the typical advice from PPC experts is to bid on brand, without exception. The question is whether this is good advice or just self-serving. Let's look at each of the main arguments for brand bidding in more detail and see if the answer is more nuanced than black and white.

  • Protect your brand

If you have competitors appearing for your brand terms, either in paid or organic results, then yes, you need to consider how you protect this traffic stream, and PPC brand bidding is an obvious option, along with improving your SEO rankings. On the flip side, if you have limited or no competition, it really is worth considering pausing branded PPC.

Logic would dictate that if a user is searching specifically for your brand, then they want to visit your site or engage with your content. If your SERPs contain no competitors, only links associated with properties you own, then the benefits of bidding on brand terms become dubious.

Even if you are convinced that you will lose some brand traffic in this scenario, you need to ask yourself if the incremental value that PPC provides is worth it. Let's say you lose 5% of brand traffic by turning off PPC brand. What that's telling you is that 95% of your PPC brand traffic would have arrived anyway.

In other words, because you would have to appear on 100% of those impressions to protect your brand losses, a £10,000 brand spend which usually shows a £100,000 return would actually have an incremental return of just £5,000. Viewed this way, your ROI goes from 9.0 to -0.5.
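
The arithmetic, as a quick Python sketch:

```python
# The incrementality arithmetic from the paragraph above
spend = 10_000
reported_return = 100_000
incremental_share = 0.05  # only 5% of brand traffic is truly incremental

reported_roi = (reported_return - spend) / spend          # 9.0
incremental_return = reported_return * incremental_share  # 5,000
incremental_roi = (incremental_return - spend) / spend    # -0.5

print(reported_roi, incremental_return, incremental_roi)
```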

  • Dominate the SERPs

Dominating the SERPs is a desirable goal for a lot of advertisers; however, this argument falters if you are already dominating the SERPs without a paid ad hogging the top spot. If you don't have paid competitors bidding on your brand terms and you have good organic rankings for your owned properties, you already dominate the SERPs and are just paying to push all your other listings down the page.

If you do have some organic competitors creeping onto your first-page results, you should look to manage your other properties outside of your main site. This is a good way to capture as many organic listings as possible and, longer term, push any SEO competitors off the first page. Ensure that your social media sites, Wikipedia page, and local listings are optimized and appearing for brand terms.

  • Control your messaging

It’s often argued that PPC is better at controlling message and landing page choice. While it is indeed more agile at both, you still have control from an SEO point of view. SEO’s have been optimizing listing copy in the meta descriptions since SEO began and any half decent SEO team will have categorized your pages and actively optimized your brand terms to land on the most appropriate pages.

If you have a sale running or have launched a new range, you may want to reflect this quickly in your copy. But again, if you have no competitors, it's logical that you will get the click anyway, and you can use your landing page to convey any important messaging.

  • Your brand terms are cheap

Brand terms often are cheaper than non-brand terms, but unless bidding on them adds incremental benefit, it is just an unnecessary cost. You could be using that spend on activity that drives new customers and grows your business.

  • Capture high-quality traffic near the point of conversion

PPC managers and teams love to bid on brand because it converts well, makes reports look great and can mask poor-performing activity.

This obviously isn't a good enough reason on its own, and it only applies if you must defend against losing conversions to competitors. Otherwise, you are just taking credit for conversions you would have received anyway.

  • Brand terms improve overall account quality score

The premise here is that brand bidding will improve overall quality score and therefore decrease cost per click on other terms (at least in the early stages, when newly launched keywords are yet to establish a QS). However, Google has never admitted that an account-level quality score exists. It has chopped and changed QS measurement over the years: all new keywords used to launch with a QS of 6; now they start at zero and build as they accrue data. According to Google, keywords are assessed on their own merit and gain a quality score once they build data. PPC experts have asserted that account-level QS exists, but there is little hard evidence to support that claim, so the supposed benefit is wishy-washy at best.

My view is that brand bidding should be avoided if possible. A PPC manager will only add value by growing prospecting activity, not piggybacking off the success of a brand name.

If you can turn off brand, ensure that you break your brand terms into different categories and assess the need to bid on them independently. A core brand term [brand x] may have no competition, but a brand + product term [brand x shoes] may be more likely to receive competition as other advertisers broad match on the product term (rather than directly bidding on your brand). Also, add negatives as well as pausing keywords and continue to monitor the situation, in case competitors appear.

If you must bid on brand, then task your PPC manager with making it work as efficiently as possible, getting that traffic cheaply. If it's within your power, take steps to remove the need to bid on brand at all: boost the SEO rankings of other owned properties, like Facebook and Wikipedia, and reach out to resellers and even competitors about brand bidding – you may be able to reach an agreement to stop bidding on each other's terms.

Do remember to assess your situation on a case by case basis and understand if brand bidding is right for you.

Tuesday, May 22, 2018

A review of the payday loans algorithm in 2018

For several years, the search term ‘payday loans’ has regularly attracted more than 200,000 searches per month on Google.co.uk. Whether providing loans or generating leads, the payday loans industry has notoriously been big business and at its peak, was estimated to be worth around £2 billion per year.

Because of this, the top positions on Google's SERPs for 'payday loans' have been hugely lucrative and sought-after, and were subsequently dominated by SEO professionals using massive manipulation to hack their way to the top of the search results.

Until 2013, page one for payday loans barely listed a real payday loan company. Instead, the listings were made up of 'hacked sites' – sites selling bicycles, women's magazines and, frankly, just random domain names that, once clicked, redirected to a dubious data capture form.

Introducing the payday loans algorithm

With customer data at risk and a mountain of complaints from UK consumers (and similar results in the US), Google reacted and introduced an official "payday loans algorithm" in June 2013. For the search giant to acknowledge a particular search term – giving it its own algorithm and focusing on a micro-industry – was certainly out of the ordinary, and we are yet to see any other industry treated the same way.

The initial payday loan update rolled out over a two-month period, beginning in June 2013; Payday 2.0 followed on 16 May 2014, and Payday 3.0 shortly thereafter in June 2014.

Whilst the first algorithm change was a general clean-up, payday loans algorithm 2.0 focused on targeting spammy queries, abuse of Google+ accounts, and doorway and hacked websites. Payday loans 3.0 was geared towards tackling spammy links, including low-quality links, reciprocal links, forums, blog networks and websites that require paid submissions in exchange for a link.

Soon after the rollout of Payday 3.0, the search results were essentially cleaned up, and they have since offered a much clearer representation of how rankings for payday loans should look, showing legitimate companies.

Websites targeted by the algorithm changes were subsequently penalized in Google searches, dropping 10 pages or even falling off the face of Google altogether. A handful of sites that had previously dominated the SERPs ceased to maintain any online real estate at all, including Tide U Over and Red Wallet.

Bringing payday to today

The payday loans business took another drastic turn following the introduction of FCA regulation in January 2015. Whilst the industry remains lucrative, the number of active companies has diminished significantly in the last three years – from 200 lenders to around 40, and from what was originally hundreds of comparison sites to around a dozen. Margins have been hit by the introduction of a price cap, keeping daily interest at a maximum of 0.8%, and by tougher regulation on the selling of data – leading to much higher operating costs and barriers to entry.

While there have not been any further releases of the payday loans algorithm, Google is still keeping an eye on the space and even implemented a ban on PPC ads for payday loans in 2016. The outcome was far stricter in the US than in the UK, where lenders and comparison sites can still show paid ads but are required to show proof of their regulatory license to Google before going live.

How to successfully rank for payday loans in 2018

Fast forward to 2018 and there are 10 legitimate companies ranking in the top 10 for ‘payday loans’ in the organic search on Google.co.uk.

Our SEO company has successfully ranked five of the websites currently positioned in the top 10, and based on the success we have seen, we have identified some of the main trends below. These seem to be very specific to a payday loans algorithm and differ from the techniques used to rank for other keywords in loans and insurance.

Direct lenders win over comparison websites: All websites positioned 1 to 10 are actual providers of payday loans, known as 'direct lenders', not comparison websites. While the main comparison sites in the UK dominate the search results for things like life insurance, car insurance and personal loans, none of these companies comes near the top three pages for 'payday loans', despite all having a landing page targeting this keyword.

In positions 1 to 20, there is only one comparison website that features all the lenders, and we are responsible for their SEO. However, their homepage resembles a direct lender's, with a calculator and an 'apply now' button rather than a comparison table format.

Brands win over exact match or partial match domains: No website listed in the top 10 has the word 'payday' in its domain, suggesting that Google prefers brands over exact match or partial match domains. Compare this with other industries, where logbookloans.co.uk ranks first for 'logbook loans' and two companies with the main keyword in their domain name rank on page one for 'bridging loans'.

Keeping with the brand theme, sites that rank well have quality traffic from several sources, including direct, paid, social and email. To benefit SEO, those users should show high engagement rates, high average time on site and low bounce rates. This can be hugely beneficial for search rankings but is not a deciding factor in isolation. Companies such as Sunny and Lending Stream advertise heavily on TV and generate good direct traffic as a result, yet their lower search rankings show that strong direct traffic alone does not translate into better positions.

Domain age less relevant: Whilst domain age appears to be an important ranking factor in several industries, such as car insurance, it seems to be less relevant for payday loans. Notably, three of the top five that rank (Cashfloat, Drafty and StepStone Credit) are less than two years old. This could be attributed to younger domains having accumulated less spam and fewer low-quality links than much older domains.

Links still win: domains with more links tend to outrank those with fewer links. Interestingly, around seven of the top 10 seem to have similar domains linking to them, suggesting there are some links that Google clearly values in this industry. However, finding the balance here is key, as some of these similar links have a very low DA and a spammy link history. Understanding which will work well is the difference between better search positions and a penalty.

Strong user experience: A strong UX that makes it clear where to apply for a payday loan is proving more effective than thousands of words explaining what payday loans are. Keeping in line with user intent, successful websites use calculators, images and videos to drive the application, rather than padding pages with thin content.

Room for alternatives: Two sites currently in the top five for payday loans offer alternatives (StepStone Credit and Drafty). This could highlight Google's moral obligation to offer a variety of products, not just high-cost short-term loans, and hints at whether it is in fact manually organizing the SERPs itself.

To conclude, the usual SEO techniques of brand building, link acquisition and good user experience still apply to ranking well in a modern payday loans algorithm. However, there is no doubt that payday loans in 2018 still requires a very specific approach, which can be achieved by looking at the sites that rank successfully and getting a feel for the content they write and the links they acquire.

In an ideal scenario, we should see MoneyAdviceService ranking top of the tree, since it has the most authority and numerous links from every single payday loans company in the UK – but as it has sat on page three for some time, this is proof that the beast of ranking for payday loans surely has a mind of its own.