In this post, you’ll learn how I ranked a window cleaner’s website at the top of the search results in just under 2 months for local keywords. As a result, he saw a 1,000% ROI and such an increase in business that he had to stop the SEO service.
That’s right, he saw such a significant increase in the number of leads his site was generating that he had to put an end to the service because he couldn’t handle any more. Impressive stuff; I was quite chuffed with the outcome, and so was he.
In this post you’ll learn what happens when you apply guerrilla SEO tactics to a small window cleaner’s site, resulting in the site ranking 1st for every ‘window cleaning’ phrase for his local town, plus every other town within a 10-mile radius of his location.
Expanding on what I mean by every ‘window cleaning’ phrase, this should give you an idea:
- window cleaning [location]
- window cleaner [location]
- window cleaner in [location]
- best window cleaner [location]
- cheap window cleaner [location]
- affordable window cleaner [location]
- and a whole lot more phrases, he ranked for everything related to window cleaning.
It’s fun working on local SEO projects, as the competition is minimal and you see a significant increase in the client’s organic revenue from the changes you’ve made to their site. It feels good, plus I like to help other people, so why not?
That’s crazy, how did you do it?
It was an easy job; in fact, I was budgeted to work on this project for just 5 hours per month. I went a little over budget, but for a good cause. This was only achievable because the competition for local window cleaning keywords was near to none; his site was competing against static HTML sites that had only 6 pages indexed in Google. I received this brief when I worked agency-side and, oh my, I was keen to get going as soon as possible.
I would name the business, but although I no longer work agency-side, I am still bound by the exit contract not to release any client names without their written consent. Before I explain how I did this, and what I did exactly, here were the OSE/Majestic metrics before getting started:
- Domain Registered: 17-Jan-2013;
- Domain Authority: 9;
- Page Authority: 21;
- Trust Flow: 10;
- Citation Flow: 31;
- External Backlinks: 28;
- Referring Domains: 6.
As you can see from the above metrics, there isn’t anything special about this domain. It’s not that authoritative, but at least it’s been around for a couple of years and it isn’t sitting in Google’s sandbox. During this process I noticed one spammy .xyz domain that was pointing a couple of links at their site. I knew I could have some fun with this domain.
Phase 1/9) Check the search volume for possible search queries
I’m not a fan of keyword research for local keywords, as the chances are the search volume will be close to zero, but I thought I’d give it a try before proceeding with the on-page stuff. Plus, the client wanted to know *roughly* how many people were searching for window cleaning services around his area, and whether people were searching for generic terms like ‘window cleaners’ or for specific services like the following:
- Conservatory cleaning;
- Gutter cleaning;
- Solar panel cleaning;
- Skylight cleaning;
- Fascia cleaning.
After tapping away on SEMrush and other tools to find some generic and long-tail keyword variations, it turned out that the majority of people search more generically. The search volume wasn’t huge, but I knew there was an opportunity for the business to grow if they were dominating the SERPs. He wanted to be known as the number one window cleaner in his local town, plus all of the surrounding areas. He’s got the workforce, but his website just wasn’t cutting it for him, as his van was generating virtually all of his leads (he has his phone number, company name, and services printed on it). Therefore, we decided to focus primarily on the more generic phrases like ‘window cleaners [location]’ and other variations.
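If you want to script the combination step rather than typing every phrase by hand, here’s a minimal sketch (my own addition, with placeholder service and town lists) that spits out ‘[modifier] [service] [location]’ combinations ready to paste into SEMrush or Keyword Planner:

```php
<?php
// Generate keyword combinations for a volume check.
// The service, modifier and location lists below are placeholders.
$services  = ['window cleaning', 'window cleaner', 'gutter cleaning', 'conservatory cleaning'];
$modifiers = ['', 'best ', 'cheap ', 'affordable '];
$locations = ['Bath', 'Clandown']; // swap in the real towns within the 10-mile radius

foreach ($locations as $location) {
    foreach ($services as $service) {
        foreach ($modifiers as $modifier) {
            echo trim("{$modifier}{$service} {$location}") . PHP_EOL;
        }
    }
}
```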
Phase 2/9) Eyeballing their competitors’ websites and link profiles
This part was fun and it only took a couple of minutes before I realized that it wouldn’t take me too long to get this site ranking top. As you can imagine...
Here is what one of their competitors’ link profiles looked like (and they were ranking in the top three):
That’s a lot of links from very few domains; I’m thinking something malicious happened here. Hopefully, now that Penguin runs in real time, it’s ignoring most of those links, as it looks like some site-wide linking activity is happening.
After some digging, it looks like the domain kassy-kan.net is linking to them over 14,000 times. The site isn’t relevant to their niche at all.
Phase 3/9) Determining what needs to be done from a technical perspective
As I am effectively changing the appearance of a client’s site, I need to be aware of what can and can’t be changed moving forward. Therefore, I planned an initial meeting with the client to discuss possible implementations and changes to the structure of their site, to improve Google’s understanding of the niche they are targeting, the search queries they want to appear for, and so on. This wasn’t a 2-minute job, as I had to mess around with the HTML/CSS side of the site; however, I made sure I covered the fundamentals:
- Removed all of the internal CSS included in the header section and placed it into its own external stylesheet;
- Converted all PNG image files above 700KB in size to JPEGs;
- Added the spammy, irrelevant .xyz domain to the disavow file;
- Removed the meta keywords tag from the site’s header and all of the keywords contained within;
- Robots.txt check, checking the site from Googlebot’s perspective, sitemap installation;
- Added an H1 tag at the top of the page, including primary keywords & the top 4 locations they are targeting;
- Changed the homepage title to include the primary keyword, phone number and locations;
- Implemented H2 & H3 tag(s), including ‘window cleaners’ in the H2, but not the H3, to avoid over-optimisation;
- Applied ALT tags to improve Google’s understanding of their images; I used Screaming Frog to identify the missing ones;
- Adjusted content to include different keyword variations;
- Adjusted their client testimonials on the homepage when necessary to try and include relevant search terms;
- Included social sharing functionality on the homepage;
- Increased the size of their phone number in the header section;
- In the footer section, I included an ‘Areas We Cover’ section, adding a list of locations they serve;
- Included a link to their sitemap in the footer;
- At the very bottom of the footer, I included a paragraph outlining most of the locations they serve;
- Included their range of services in the footer: ‘Conservatory Cleaning’, ‘Gutter Cleaning’, etc.;
- Lots of other implementations, plus bolding a few keywords for both SEO & UX.
The homepage was predominantly optimised to appear for the location in which the business operates, whilst also covering a few other local towns.
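To make that checklist a bit more concrete, here’s a stripped-down sketch of roughly what the homepage markup ends up looking like after changes along those lines. The business details, phone number and towns are placeholders of mine, not the client’s:

```php
<?php /* Simplified homepage sketch – names, numbers and towns are placeholders. */ ?>
<title>Window Cleaners in Bath, Clandown, Keynsham &amp; Saltford | Call 01234 567890</title>

<h1>Window Cleaners in Bath, Clandown, Keynsham &amp; Saltford</h1>
<h2>Friendly, reliable window cleaners covering your local area</h2>

<img src="/images/van.jpg" alt="Window cleaning van serving Bath and the surrounding towns">

<footer>
  <h3>Areas We Cover</h3>
  <ul>
    <li>Bath</li>
    <li>Clandown</li>
    <li>Keynsham</li>
    <li>Saltford</li>
  </ul>
  <p>We provide window cleaning, conservatory cleaning, gutter cleaning and solar panel
     cleaning across Bath, Clandown, Keynsham, Saltford and the surrounding villages.</p>
  <a href="/sitemap.xml">Sitemap</a>
</footer>
```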
Phase 4/9) Integrating Google Analytics & Webmaster Tools
Of course, in order to benchmark our implementations against the agreed KPIs, we needed some software in place to do so. As GA & WMT are Google’s products, and they *probably* favour sites using their own inventions, we decided to go down this route; plus, it’s free. That said, it’s always nice to get some visibility of your site outside of Google, so if you have the budget, I’d highly recommend investing in an additional visitor tracking tool as well.
This involved adding the analytics tracking code to the site, verifying the Webmaster Tools property, adding sitemaps, setting the preferred URL, etc. As the site was hosted on a .com domain, it was important to set up geo-targeting to the relevant country (the UK in this case); it’s a local business that will only ever serve those within a 10-mile radius, so we didn’t want Google treating the site as globally targeted.
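For the analytics part, here’s a minimal sketch of how a tracking snippet can be hooked into a WordPress theme from functions.php. The property ID is a placeholder, and the assumption is that you’d paste whatever snippet the Analytics admin screen gives you rather than this exact one:

```php
<?php
// functions.php – output the Google Analytics async snippet on every page.
// 'UA-XXXXXXX-1' is a placeholder property ID.
function wcl_add_analytics_snippet() {
    ?>
    <script>
      window.ga = window.ga || function () { (ga.q = ga.q || []).push(arguments); };
      ga.l = +new Date();
      ga('create', 'UA-XXXXXXX-1', 'auto');
      ga('send', 'pageview');
    </script>
    <script async src="https://www.google-analytics.com/analytics.js"></script>
    <?php
}
add_action( 'wp_head', 'wcl_add_analytics_snippet' );
```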
Phase 5/9) Build a Deadly Web 2.0 Private Blog Network
If you’re an SEO yourself, then you’ve probably heard of the tiered link building strategy. It’s quite old, but it still works, especially for keywords that aren’t competitive (like the ones this project required). This strategy involves registering free Tumblr and WordPress sites that are hosted on the back of the official domain, something like this: seoisawesome.wordpress.com. By default, because this site is a subdomain of wordpress.com, it carries a domain authority of 85+, but a very low page authority (PA). Therefore, you’ll need to build links to this site to increase its PA (it’s relatively easy, as Google considers the site trustworthy because it sits on the back of an authoritative domain). Once you’ve built *some* links and waited for those links to get indexed, you have a good site that you can link from to your money site (the site you’re looking to rank), and the link will carry *some* weight.
I’m not going to discuss this in too much depth, as there are lots of articles on the subject already. However, because I was aware that my client’s competitors’ link profiles weren’t that powerful, I could get away with such a simple tactic to rank the site.
There are three ways you can approach this strategy:
- Manually create these web 2.0 sites and add three 650-word articles to each of them, then proceed with blasting hundreds of social bookmarks, blog comments, forum profiles, and other web 2.0 sites to build their authority;
- Find expired web 2.0 accounts that SEOs and/or genuine people have previously created. These subdomains have expired and are therefore available to register. Upon registering, you’ll inherit all of the external sites that are linking to that subdomain, so you don’t have to spend time building bookmarks, blog comments, forum profiles, etc.;
- Pay a virtual assistant to scrape and register the expired web 2.0’s for you.
I took route 2. This is easily doable if you have the time; however, you will need a Scrapebox licence, plus a decent footprint list, comment list, and proxy list in order to pull some decent results.
You can read more about how to build a web 2.0 network here. I got together and registered around 6 expired Tumblrs, all of which had a PA of 30+, plus 3 expired WordPress subdomains and a Weebly, and then started to populate all of these with decent content. After a couple of hours on the Red Bull, I’d put together three 650-word articles for each; these articles weren’t the highest quality, but they contained images, videos and such to make them look genuine.
Please be aware that if you try this, you MUST register each of the web 2.0 sites from a different IP address, otherwise they may get deleted in the near future.
Once all of the web 2.0’s were populated, I then linked to my money site (in this case the window cleaner’s site) with various combinations of anchor text:
- Partial-match anchors (learn more about window cleaning);
- Long anchors (I loved using this window cleaning service);
- Exact match anchors (window cleaning);
- Branded anchor text (company name);
- Naked link (full URL);
- Generic anchor (go here, click here).
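The post doesn’t prescribe exact ratios, but if you want to script the rotation, here’s a rough sketch with illustrative weights. The anchors and percentages are placeholders of mine; the only point is that branded and naked anchors dominate while exact match stays rare:

```php
<?php
// Weighted anchor text picker – weights are illustrative only (PHP 7.3+ for array_key_first()).
$anchors = [
    'Example Window Cleaners'                     => 30, // branded
    'https://www.example-windowcleaners.co.uk/'   => 25, // naked URL
    'click here'                                  => 20, // generic
    'learn more about window cleaning'            => 15, // partial match
    'I loved using this window cleaning service'  => 8,  // long anchor
    'window cleaning'                             => 2,  // exact match, used sparingly
];

function pick_anchor(array $weighted): string {
    $roll = mt_rand(1, array_sum($weighted));
    foreach ($weighted as $anchor => $weight) {
        $roll -= $weight;
        if ($roll <= 0) {
            return $anchor;
        }
    }
    return array_key_first($weighted); // unreachable fallback
}

echo pick_anchor($anchors) . PHP_EOL;
```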
Phase 6/9) Grabbing some cheeky links from directory sites
Even though Google has specifically stated that directories aren’t the right way to do link building, it’s always good to be included in *some* decent directories, as it’s quite natural for a local business to be listed in a few.
Most importantly, you’re able to obtain a do-follow link from the Yell directory. Once you have created a free business listing and you have access, you’re able to populate the heading field with HTML. Therefore you can manually put an ‘a href’ link into that field and it will render as a do-follow. This may get patched in the future, but not many people know about this. Below is an image of a random site I found that is taking advantage of this.
Make sure you don’t include an address in the website URL field, otherwise a no-follow link will also be displayed on your listing (Google may read the no-follow before the do-follow, and then ignore the more important link).
I then found a local directory site that allows HTML entries in the business title, so I included yet another do-follow link to the window cleaner’s site using a branded anchor. The directory then rendered that link throughout the entire site, every time it referenced my client’s business title (once clicked, it goes to the listing for that business). This may get patched sometime soon.
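To make the trick concrete: the heading/business title field ends up containing raw HTML, something like <a href="https://www.example-windowcleaners.co.uk/">Example Window Cleaners</a> (a made-up name and URL for illustration). When the listing renders, that markup is output as-is, which is what gives you the do-follow branded link from the directory page.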
I used Ultimate Demon to submit the site to some relevant directories in the local area.
Phase 7/9) The internal link structure of domination
After implementing all of the previous tactics, I waited patiently for a couple of weeks to avoid the site picking up a penalty. I slowly started to see the site’s rankings and authority increase, thanks to all of the additional links the site had gained from web 2.0’s and highly authoritative sites (you may want to drip-feed those links over time).
As you can imagine, Google started to consider our site to be much better than the static HTML site with over 15,000 spammy links pointing at it. Therefore, it only took a couple of weeks to surpass them and take that first spot. Brilliant, but what about all of the other towns?
Before re-focusing our efforts on the other towns and creating clearly-defined landing pages to target each one, it’s important to raise the authority of the root domain, otherwise your landing pages probably won’t stand a chance in the SERPs, even when the competition is very minimal. To get started, I put together a list of the surrounding towns/villages within a 10-mile radius of their location; Wikipedia helped me here, as you’re able to search by county, which will then display all of the towns in that specific area. Pretty neat.
Once I had the locations, I created a separate page called /areas-we-cover/ to list all of them as bullet points. I used a 3-column layout here to make it look somewhat presentable, even though its only intention is to get Googlebot to visit the landing pages and pass juice. Once I’d listed all the locations, I then added the links (even though the landing pages weren’t created yet, I already knew the structure). If the location was called ‘Clandown’, I would create a link from that anchor text to the following page: /clandown/. I decided not to include any keywords in the slug to avoid over-optimisation, as I was planning on creating A LOT of pages. Below is an image of another site that is effectively using this technique:
If you’re going to try this, it’s worthwhile writing *some* content to put on the page, like in the image above. Otherwise, if the page is just full of links, it may look dodgy from Googlebot’s perspective. Quick recap: if you’re unaware, Google lifted the 100-links-per-page limit a couple of years ago. Google used to take action on sites that had over 100 links on a single page; that isn’t the case now. If the links provide meaning to the user and don’t come across as spammy, you should be in good shape. Matt Cutts talks about this in more depth in the video below:
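To show the structure in markup form, here’s a rough sketch (my own, with placeholder towns, copy and class names) of how the /areas-we-cover/ page body can be laid out:

```php
<?php /* /areas-we-cover/ page body – towns, class names and intro copy are placeholders. */ ?>
<h1>Areas We Cover</h1>
<p>We provide window cleaning across the following towns and villages, all within
   roughly 10 miles of our base. Click your area for more details:</p>

<div class="areas-column">
  <ul>
    <li><a href="/clandown/">Clandown</a></li>
    <li><a href="/bath/">Bath</a></li>
    <li><a href="/keynsham/">Keynsham</a></li>
  </ul>
</div>
<div class="areas-column">
  <ul>
    <li><a href="/saltford/">Saltford</a></li>
    <li><a href="/radstock/">Radstock</a></li>
    <li><a href="/peasedown/">Peasedown</a></li>
  </ul>
</div>
<?php /* ...plus a third column to complete the 3-column layout. */ ?>
```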
Phase 8/9) Creating a landing page template using PHP
This part is probably the trickiest of them all and will require some effort. As the site was running on WordPress, it’s relatively easy to create a landing page template. You can learn more about how to create a WordPress template here.
To start the process, I replicated the original site template and then built on it. I kept the same header and footer but changed the body content. I implemented some divs with custom styling and pretty much built placeholders on the page for when I would add content. It’s important that the page template is unique and doesn’t look too much like the rest of the site, as I was planning on building a lot of landing pages; to reduce the chance of Google not indexing the pages, they needed to be somewhat unique. The page template included all of the basic fundamentals: h1, h2 and h3 tags, a big paragraph section for lots of content, an image section, etc. Below is a rough example of the body section of the template:
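Something along these lines, in simplified form (the real templates had more varied structure and far more copy; the class names and placeholder copy here are purely illustrative):

```php
<?php
/*
 Template Name: Location Landing Page
 Simplified landing page template sketch – copy and class names are placeholders.
*/
get_header(); ?>

<h1>Window Cleaner in <?php the_title(); ?></h1>

<p>Looking for a reliable, affordable window cleaner in <?php the_title(); ?>?
   We cover every street in <?php the_title(); ?> and the surrounding villages.</p>

<img src="<?php echo esc_url( get_template_directory_uri() . '/images/van.jpg' ); ?>"
     alt="Window cleaning services in <?php the_title(); ?>">

<h2>Why choose us in <?php the_title(); ?>?</h2>
<div class="landing-content">
  <?php the_content(); // the unique copy written for this particular template ?>
</div>

<?php get_footer(); ?>
```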
I then got to work and wrote some content in Word. I wrote the content as if I was writing specifically about a certain location, including phrases like ‘services in Clandown’ and ‘best window cleaner in Clandown’. I even optimized the heading tags and ALT tags to include locations. However, instead of referencing a specific location, I would use PHP to render the WordPress page title, using < ?php the_title(); ? > or < ?php wp_title(); ? > (without the spaces between the question marks and the brackets).
Below are some examples:
- window cleaner in < ?php the_title(); ? >
- The best window cleaner in < ?php wp_title(); ? >
- Are you located in < ?php wp_title(); ? >
Just so you’re aware, the SEO page title and the WordPress page title are completely different things. The page title output by the_title() is the actual name of the page that’s set when creating a new page, whereas wp_title() builds the string used in the document’s <title> tag.
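A quick illustration of the difference, using a page created with the name ‘Bath’ (this snippet is mine, not lifted from the original template):

```php
<?php
// Inside the page template, for a page whose name is "Bath":
the_title();                         // echoes "Bath" – the name set when the page was created
$doc_title = wp_title( '', false );  // returns the document-title string the theme builds for
                                     // the <title> tag, which may include separators/site name
```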
The reasoning behind this approach is that when I create a new page called ‘Bath’, these parameters will automatically populate ‘Bath’ in the areas I set. Therefore it will look like this:
- window cleaner in Bath
- The best window cleaner in Bath
- Are you located in Bath?
Instantly, the page is now optimized for the Bath area. It’s quite cheeky, as you can include the title parameter in heading tags, paragraph tags, SEO page titles, strong tags and even ALT tags. By default, if the page is named ‘Bath’, its slug will be www.website.com/bath/, which fits in perfectly with our /areas-we-cover/ page.
Please be advised that I repeated phase #8 multiple times: I created 10 different templates, all of which had unique content, structure, and positioning. It’s important that you have some unique content, otherwise Google won’t bother indexing the pages.
Phase 9/9) Automating the landing page creation process
So, once I had my landing page templates in place, it was just a case of creating the pages. As my plan was to create 50 landing pages for 50 different locations, I wasn’t too keen on manually going through and clicking ‘Add New’ in the WordPress CMS 50 times. I decided to use a CSV upload plugin instead, so I could simply populate a spreadsheet, upload it and watch the magic happen.
In order to use the CSV importer effectively, you’ll have to get to grips with the column names and values that WordPress uses in the database, because via the spreadsheet you’ll effectively be telling WordPress what you’d like your pages to be called, which template they should use, the SEO page titles, etc. I have listed some of them below, with a rough import sketch after the list:
- post_author: (log in or ID) The user name or user ID number of the author;
- post_date: (string) The date and time of publishing;
- post_title: (string) The title of the post;
- post_status: (‘draft’ or ‘publish’ or ‘pending’ or ‘future’ or ‘private’ or custom registered status) The status of the post. ‘draft’ is the default;
- post_name: (string) The slug of the post;
- _wp_page_template: (page template) The landing page template I created;
- post_type: (‘post’ or ‘page’ or any other post type name) (required) The post type slug, not labels.
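If you’d rather see what the importer is doing under the hood (or skip the plugin entirely), here’s a rough sketch of the same idea using wp_insert_post(). The file name, column order and template file name are all placeholders, and it assumes the script runs inside WordPress (for example via WP-CLI’s wp eval-file):

```php
<?php
// create-landing-pages.php – run inside WordPress, e.g. `wp eval-file create-landing-pages.php`.
// The CSV's first row is the header: post_title, post_name, post_author, post_status, _wp_page_template
$rows   = array_map( 'str_getcsv', file( 'locations-batch-1.csv' ) );
$header = array_shift( $rows );

foreach ( $rows as $row ) {
    $data = array_combine( $header, $row );

    wp_insert_post( [
        'post_title'    => $data['post_title'],        // e.g. "Bath"
        'post_name'     => $data['post_name'],         // e.g. "bath" -> /bath/
        'post_author'   => (int) $data['post_author'],
        'post_status'   => $data['post_status'],       // 'publish' or 'draft'
        'post_type'     => 'page',
        'page_template' => $data['_wp_page_template'], // e.g. 'landing-template-1.php'
    ] );
}
```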
Once I had a spreadsheet in place outlining the above details, I split it up into 5 different CSV files. We didn’t want to request all 50 pages at once, so I uploaded one CSV file a day for 5 days. Each file contained 10 pages split between 2 landing page templates, so each template ended up covering 5 locations.
Once the landing pages were created, I included a direct link to the /areas-we-cover/ page in the footer. Therefore, every page on the site passes link juice through to each individual landing page via the internal link structure. I also submitted the newly created landing pages to One Hour Indexing to ensure they were indexed promptly.
This strategy only worked because the nearby towns weren’t competitive at all, so a well-optimised page with lots of content still has a decent chance of ranking, even without any links. Below are some of the landing pages that come up when the ‘site:’ command is executed in Google search for the domain. This doesn’t include the other 45 locations.