Around 12 months ago I had a lightbulb moment when it comes to keyword research.
I devised a simple concept that would, in theory, allow me to find the most searched-for keywords in any niche where Google are favourable towards ranking new websites and pages.
I started testing the concept on a small scale, using free tools, and the initial results were honestly incredible.
(I test a lot of ideas and this doesn’t happen very often).
I quickly scaled things up and automated the process, and that lightbulb moment ended up being one of the smartest things I have ever done in my thirteen years in SEO.
In this advanced keyword research guide I’m going to show you exactly what that idea was, how you can implement it today, and give three more advanced tactics I’ve never shared to go with it.
It’s great to be blogging again.
Strategy #1: A Powerful Keyword Research Concept That May Just Change Everything
I will never call myself an SEO guru, but someone who deserves that title called the following tactic “brilliantly simple,” so I hope you’ll feel the same.
The concept is this: If new websites are ranking highly for terms that get thousands of searches per month, those terms are likely easy to rank for with a new website of your own, or even easier to capitalise on with an established site.
I theorised that if sites under two years old are ranking well for terms worth targeting then I want to know what they are so I have a chance of ranking for them as well.
Young websites haven’t had as much time to establish links and authority, so the rankings are probably skewed towards “freshness” and on-site SEO.
Thanks to acting on this idea I found not only the most incredible keywords to target, but also the most incredible link sources and niches to enter as well.
I have never seen anyone else teach this, so hopefully it’s new for you as well.
If you want this data for yourself, I’m going to share three ways to get it:
- A free but incredibly slow method
- A quicker method with a small price tag
- A more expensive method which is incredibly fast
Let’s dive in…
The Free But Painfully Slow Way of Getting This Data
This free method is incredibly slow and there’s a good chance you’ll give up before you find anything valuable.
I’m just being honest.
I didn’t have money to spend on “messing around” when I started out online so I’m including this in the hope that it might help just one person.
To start with you’ll need a list of dozens or hundreds of keywords related to your niche that get between 500 and 5,000 searches per month.
I picked this range as most terms receiving less than 500 searches probably aren’t worth targeting, and those that get more than 5,000 are less likely to have ‘young’ sites ranking.
As this is an advanced keyword research guide, I’m not going to go in-depth on how to build an initial batch of keywords. There are a million beginner guides out there that will cover this for you.
Once you have the terms, simply head on over to Google and start searching for them.
If you’re unfamiliar with any of the sites ranking (e.g. they’re not huge brand names like Amazon), put them in Whois.com or any other WHOIS checker and see how old the domains are.
If the site is young and it’s ranking on the first two pages for your chosen keyword, awesome.
Use Ahrefs, SimilarWeb, SEMRush or similar and see how much traffic they’re getting overall and what other terms they’re ranking for.
It will be slow as I say, but if you’re on a budget you can do this all day every day without spending a penny.
Keep in mind it only takes one site to be a great ‘hit’ for you to find a lot of terms to target by looking into other queries they specifically are ranking for.
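To save a little copy-and-paste work, the URL-to-domain step can be scripted. This is a minimal Python sketch with made-up example URLs; note that for subdomains (like blog.bigbrand.com) a WHOIS lookup really wants the registrable domain, which you can resolve by eye on a manual pass (or with the Public Suffix List if you automate further):

```python
from urllib.parse import urlparse

def extract_domain(url):
    """Pull the hostname out of a ranking URL and strip any 'www.' prefix."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

# Example: URLs copied from the first two pages of results for one keyword
# (these sites are hypothetical, purely for illustration)
ranking_urls = [
    "https://www.example-young-site.com/best-widgets/",
    "https://blog.bigbrand.com/widgets-guide",
]
domains = sorted(set(extract_domain(u) for u in ranking_urls))
```

You can then paste the de-duplicated list straight into a WHOIS checker.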
How to Speed Up the Process Dramatically for ~$150
Both parts of the free process are incredibly slow.
It’s slow to enter every query into Google and go through each search result, and it’s slow to take each site through a Whois checker to see how old it is.
To solve the first problem you have two great tools to help you.
One is Scrapebox, which retails for around $99, and the other is SERPScraper from URLProfiler which is completely free.
The latter looks like this when in use:
Both tools allow you to extract Google search results for as many queries as you like.
Note: This practice goes against Google’s Terms of Service. What you will be doing is 1/1,000th of the level that any rank tracker or similar service offers (and they profit from this data) but I do have to be responsible and give a heads up.
Something to keep in mind is that you’ll need to use proxies if you are going to do this at any kind of scale. If you aren’t using proxies, there’s a good chance Google will ban your IP, or your VPN’s IP, and render you unable to do this work for a while.
I’ve never had a problem with BuyProxies.org who offer 50 semi-dedicated (shared) proxies for just $40 for 30 days.
Their prices are monthly but you don’t get tied into a contract; payments are once-off, whenever you need them.
How many proxies you need will depend on how many queries you are looking up and how quickly you want the results.
The more patient and ‘human’ you are, the fewer proxies you’ll need.
For the Whois side of things, you don’t have to spend any money as there are a lot of bulk Whois checkers which allow you to check up to 500 domains at a time.
They typically look something like this:
I’m hesitant to link to any because they frequently stop working.
Honestly, a simple Google search for “bulk whois checker” is going to be your best solution. I’m not being lazy; they just tended to break when I personally used them.
With this method you can easily check thousands of sites over the course of an hour or two.
It’s not incredibly fun, but as I said in the last section, you only really need a few hits (a very young site with a lot of top rankings) before you delve deeper into their keyword and backlink profiles and learn a ton in the process.
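If you’d rather script the age check itself, raw WHOIS output can be parsed for its creation date. This is only a sketch: WHOIS formats vary wildly between registries, and the regex below only handles the common “Creation Date:” style you’ll typically see for .com domains:

```python
import re
from datetime import datetime, timezone

def creation_year(whois_text):
    """Return the registration year from raw WHOIS output, or None.

    Only handles the common 'Creation Date: 2017-03-14T...' format;
    other registries use different labels and date styles.
    """
    match = re.search(r"Creation Date:\s*(\d{4})-\d{2}-\d{2}", whois_text)
    return int(match.group(1)) if match else None

def is_young(whois_text, max_age_years=2, now=None):
    """True if the domain was registered within the last `max_age_years`."""
    year = creation_year(whois_text)
    if year is None:
        return False
    current = (now or datetime.now(timezone.utc)).year
    return current - year <= max_age_years

# A trimmed example of what a .com WHOIS response looks like
sample = "Domain Name: EXAMPLE.COM\nCreation Date: 2017-03-14T09:12:00Z\n"
```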
The Fastest Way to Do This With a $500+ Budget
If you’re a programmer, or have a programmer on your team, you may have a far better idea of how to tackle this concept than what I’m about to share.
This is just the way I went about things. If you have a better approach, act on that.
The first tool I used when looking to scale up was DataforSEO. They allow you to pull back the search results for as many search queries as you want without having to use Scrapebox, proxies and so on.
They handle all of that for you, and can do so at massive scale.
They can get pricey as you have to deposit a minimum of $500 to continue using the service after your free trial is over.
I don’t have any relationship with them and there are no affiliate links in this post, but they have incredible support and I had no reason to use any other offering.
The next tool I used was a combination of WhoAPI and something custom a programmer made for me (I don’t know exactly what, or why, as WhoAPI worked well).
They are incredibly cheap – starting at just $16/m for 6,000 Whois requests – and the founder, Goran, offers excellent support.
I’m not going to go into detail on how to link these together because anyone considering this option should either a) be able to figure it out very easily or b) expect to hire a programmer to do it for them.
Telling you how to use these tools would turn into a blog post teaching you how to program and that just isn’t feasible.
Put simply, the better your solution for combing through search results and verifying the age of the site ranking, the better you can scale this process and find even more valuable data.
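Whatever APIs you end up combining, the final filtering step looks roughly like this. The data shapes below are my own simplification for illustration, not DataforSEO’s or WhoAPI’s actual response formats, and the keywords and domains are invented:

```python
def keywords_with_young_rankings(serps, domain_ages, max_age_years=2, top_n=20):
    """Return {keyword: [young domains ranking in the top `top_n`]}.

    serps:        {keyword: [domain1, domain2, ...]} in ranking order
    domain_ages:  {domain: age_in_years} from your WHOIS lookups
    Domains with no age data are treated as old (skipped).
    """
    hits = {}
    for keyword, domains in serps.items():
        young = [
            d for d in domains[:top_n]
            if domain_ages.get(d, float("inf")) <= max_age_years
        ]
        if young:
            hits[keyword] = young
    return hits

# Hypothetical data, purely to show the shape of the output
serps = {
    "best budget drone": ["dronereviews.example", "amazon.com"],
    "drone laws uk": ["gov.uk", "bigbrand.com"],
}
ages = {"dronereviews.example": 1, "amazon.com": 24, "gov.uk": 15, "bigbrand.com": 12}
```

Every keyword that survives the filter is a candidate worth investigating by hand.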
The End Result: The Best Keywords, Backlinks & Niches That Exist Online
I spent weeks on this project initially and trawled through thousands upon thousands of search results, filling up Google Sheets with hand-picked sites that I would later delve into in massive detail.
I’m not exaggerating when I say this process found me better keywords, backlink sources and niche ideas than anything else I have ever done.
After all, the entire concept makes logical sense: If young sites are ranking for popular, intent-focused queries, those queries are worth looking at; as are the backlinks that helped them rank.
How the data appears will completely depend on how you acquire it, but here’s an example of how I formatted things:
I first started doing this in 2017, so domains registered in 2015 were (and are) still super relevant to me.
I should make one concession and say that using Whois is not a perfect way of finding young websites.
Let’s take my other blog, Gaps.com, as an example.
If we run Gaps through Whois, we see that the domain was registered back in 1995.
Even though I’ve been running the blog for less than two years, it would not be ‘pulled back’ using any of the methods above.
With this method you’re only going to find domains that were first registered in the last three years (or however long a timeframe you want to look at).
As I say, if you’re technically savvy or someone you work with has a better idea, go with that. I’m just sharing my own personal approach.
I’ve heard people throw out ideas of looking at website changes in Archive.org but honestly, I just didn’t need to get to that level.
I was able to find more data than I will ever be able to process or utilise myself, so I have no problem sharing this tactic with others who I hope can get value from it as well.
As a final note, I think to do things the right way, I wouldn’t make this data public.
When I initially shared this concept on a webinar a few months ago (shameless plug: join my newsletter if you want cool stuff first), a few people started thinking about all of the ways they could profit from this data.
Use it and benefit from it, but I personally wouldn’t reveal and ‘out’ the sites you’re trying to model and compete against.
I could have dedicated an entire blog post to this concept alone, but let’s keep going.
Strategy #2: The Google Custom Search Engine Method
One day I’ll talk about how I use Google’s Custom Search Engine for link building, but for now let’s look at how it can be applied to keyword research.
If you’ve never heard of it before, Google offers a tool where you can build your own search engine that allows you to search the entire web or just specific sites.
I often want to know what my direct competitors are up to when it comes to content marketing and SEO and custom search engines make that a lot easier.
With a Custom Search Engine (CSE), I can search only the specific sites that I judge to be my competition in various niches.
For this first example I’m going to take some of the world’s best design blogs from our own rankings and add them to a custom search engine.
With a simple intitle: search we can see which keywords they’re focused on for month-based search queries, which tend to include fresher content ideas you may want to mimic.
It turns out targeting design trends with monthly search queries is a thing, and something I would probably replicate if I was in the industry myself.
I can also see that wallpapers are searched for with queries that change each month as well. Another angle I might want to target.
(Later I’ll show how ever-changing these search results are, and how easy it is to rank for similar terms.)
If you have a big enough list of sites in your CSE you can also see who is already targeting search terms for 2019, for example.
You would be surprised how many sites are already focused on search queries from the future.
Here’s a good topic idea that could apply to most niches: ranking for discussions about conferences in the space.
Depending on the sites you are tracking and the niche you’re in, you can also get great insights into direct competitors by using queries like:
- intitle:giveaway – Great for seeing what audiences respond to
- updated on – Combine with date sorting to find old content they’re updating
- last modified – A variation on the above, depending on the sites in your niche
- “we interviewed” – To find people who are open to talking to multiple sites in your space
If you get creative with this, you can use queries that would be too messy to search across the entire internet but become incredibly lucrative when you’re only searching through your competitors’ websites.
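If you want to generate month-based queries in bulk before feeding them to your CSE, a few lines of Python will do it. The intitle: operator works inside a CSE just as it does in regular Google search; the topic string here is just an example:

```python
from datetime import date

MONTHS = ["january", "february", "march", "april", "may", "june",
          "july", "august", "september", "october", "november", "december"]

def monthly_queries(topic, year=None):
    """Build an intitle: query for every month of a given year,
    e.g. 'intitle:"design trends august 2018"'."""
    year = year or date.today().year
    return [f'intitle:"{topic} {month} {year}"' for month in MONTHS]
```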
I promised this would be an advanced guide so let’s kick things up a notch.
Something I don’t think many people know is that you can actually set your own sorting options in Google CSE, at least to a degree.
If you head to Search Features > Advanced, you’ll see the following:
What this allows you to do is sort by keys specified in PageMaps, rich snippets markup or metatags.
Here’s a snippet from Google’s own documentation on the feature:
For example, if your content is marked up with Reviews rich snippets, you could enable sorting by rating (number of stars) or other numerical key. If your content includes a PageMap identifying a publication, you can sort results by Publication Date.
Emphasis my own.
Inspired by Google’s own example, I set up a key to sort pages by the number of reviews they had received.
Perfect if you’re in a niche where people are likely to rate things and you want to know which pages to build out first or which products you may also want to promote.
Here’s what the key looks like:
Using it, searching any of the sites I have added to my CSE and entering custom search terms, I can get back results like this:
Pretty cool, no?
Keep in mind that you can use a Custom Search Engine to search the entire web. You don’t have to pick sites to include.
The keys can apply to most schema markup; just make sure you have included enough sites in your CSE (if you’re adding them manually) for the results to be useful.
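If you’d rather query your CSE programmatically, Google also offers a Custom Search JSON API. Here’s a sketch of building a request URL with a sort parameter; the API key and cx values are placeholders, and the sort key name must match whatever you’ve configured in your CSE control panel (the `metatags-review_count` key below is a hypothetical example):

```python
from urllib.parse import urlencode

def cse_query_url(api_key, cx, query, sort_key=None):
    """Build a Google Custom Search JSON API request URL.

    `sort_key` should match a result-sorting key defined in the CSE
    control panel (e.g. 'date', or a custom PageMap/metatag key).
    """
    params = {"key": api_key, "cx": cx, "q": query}
    if sort_key:
        params["sort"] = sort_key
    return "https://www.googleapis.com/customsearch/v1?" + urlencode(params)
```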
Other keys I’ve tried – which work on certain sites – include:
I’m still finding new uses for this myself so if you figure out anything super useful please do leave a comment and let me know. I would love to hear how you’re using it!
Strategy #3: Model Keyphrases Other Sites Have Ranked for Within Weeks
The first strategy in this guide was all about finding young sites so you can possibly mimic what has made them rank so well, so quickly.
We don’t just have to apply this theory to young sites; we can also do the same for young pages.
You will need premium tools to have the most fun with this – such as Ahrefs or BuzzSumo – but I’ll also show you a ‘hacky’ way to do this if you don’t have any money to spend.
The Free Way to Find Fast Ranking Pages
To begin, you’re going to need to pick out some of the biggest sites in your space which frequently publish content.
Don’t pick a site like Business Insider or Huffington Post that covers every topic, but rather an authority for a particular niche.
In personal finance that would be the likes of NerdWallet and Value Penguin.
Once you’ve picked some big names, head on over to Google and perform a “site:” search while changing the date range to find articles that were published between three and eight weeks ago.
You don’t really want anything younger than three weeks as Google have a tendency to rank new articles highly before they ‘drop’ to their rightful position.
On the other hand, going past eight weeks means articles have had more time to pick up links which might make competing a little more difficult.
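If you’re checking many sites, you can build those date-limited search URLs directly: Google exposes its custom date range through the `tbs=cdr` URL parameter with M/D/YYYY dates. A sketch follows; keep in mind this parameter format is undocumented and could change at any time:

```python
from datetime import date, timedelta
from urllib.parse import quote_plus

def dated_site_search_url(site, weeks_back_min=3, weeks_back_max=8, today=None):
    """Build a Google search URL for site: results published between
    `weeks_back_max` and `weeks_back_min` weeks ago, via the
    custom date-range (tbs=cdr) parameter."""
    today = today or date.today()
    newest = today - timedelta(weeks=weeks_back_min)
    oldest = today - timedelta(weeks=weeks_back_max)

    def fmt(d):
        return f"{d.month}/{d.day}/{d.year}"  # Google expects M/D/YYYY

    return (
        "https://www.google.com/search?q=" + quote_plus(f"site:{site}")
        + f"&tbs=cdr:1,cd_min:{fmt(oldest)},cd_max:{fmt(newest)}"
    )
```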
You can see an example of this in action below using NerdWallet as the site to investigate.
Even though the date associated with the first article is recent and NerdWallet rank well for related queries, this wouldn’t be a good example to dig into.
Well, it wasn’t actually published in 2018.
Thanks to Archive.org we can see it was published back in 2017, which kind of defeats the point of this experiment.
If we keep going through the search results we can find more examples, such as this article on vinyl siding.
I couldn’t find it in Archive.org, so as a cool little trick, I also opened one of the images to see, from its URL, when it was uploaded.
This works on most WordPress sites.
The post states it was published in July of 2018, and images uploaded into the article match that based on their URL structure, so there’s a great likelihood that’s when it first went live.
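That upload-date trick is easy to script too, since WordPress embeds the year and month in its media URLs by default (unless the site has turned off month- and year-based upload folders). The example URL below is hypothetical:

```python
import re

def wp_upload_date(image_url):
    """Extract (year, month) from a WordPress media URL like
    '.../wp-content/uploads/2018/07/photo.jpg'; None if it doesn't match."""
    match = re.search(r"/wp-content/uploads/(\d{4})/(\d{2})/", image_url)
    return (int(match.group(1)), int(match.group(2))) if match else None
```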
We can then search for queries we think the article has a chance of ranking for, and help verify our suspicions using the free keyword data from a Chrome extension like Keywords Everywhere.
Without spending any money, it’s fair to say there’s a strong chance that this recently published page is ranking well.
I can confirm that’s the case with premium tools.
This process isn’t particularly fun, and it will take time, but it is a free tactic you can add to your arsenal.
If you’re worried that the sites are ranking so well because of non-freshness factors (like links) you can always do these checks on ‘lesser’ sites in your industry.
The Premium, Faster Alternative
If you have access to premium tools, you can make your life much easier by using the likes of Ahrefs or BuzzSumo.
Ahrefs’ Content Explorer is perfectly built to take advantage of this angle.
Not only will it allow you to find recently published articles by site, but you can get a great estimate about how much search traffic they’re pulling in and what terms they’re ranking for.
(Disclosure: I worked with Ahrefs on a once-off project this year but have been a paying customer – and recommended their tools – for many years before doing anything together).
BuzzSumo is probably the closest competitor – you can see when particular content was published – but you will only get their link and share counts, rather than organic traffic stats.
Typically I would mention SEMRush or Moz when I cover Ahrefs but I don’t believe either have any content-related features.
In Ahrefs you can see I have performed a site-search for Smashing Magazine, chosen the date range the articles should have been published in and then sorted them by those that get the most search traffic.
We can see that Smashing Magazine published an article at the end of July which is already pulling in hundreds of visitors, because searches like “august desktop wallpaper” favour fresher results.
You can do this for any site in any niche, and it’s a huge timesaver if you can afford the tool.
For example, you can see how quickly LinkedIn started ranking for their definition of B2B marketing:
Or which new content is performing well for PCWorld:
You can get a ton of ideas from this tactic alone so take your time and have some fun with it.
If you do have an Ahrefs account already, enjoy checking the recently published article from Medium.com (using the same dates above) that’s getting the most search traffic.
Strategy #4: Reverse-Engineer ‘Weak’ Sites Which Probably Shouldn’t Rank
Have you ever found yourself clicking on a search result and thinking, “This is a terrible website”?
Try to stop yourself in your tracks next time. You might have stumbled upon a great site to compete with.
This next strategy plays on the idea that some sites rank well but probably shouldn’t, or likely won’t one day, so you can use what’s working for them to your advantage.
You may have your own idea of what a poor search result is, but I often find terms to target by specifically looking into forums.
Forums can be fantastic resources, and if you count StackOverflow as one then some will be next to impossible to outrank, but there are a lot of dated and ‘thin’ results out there which are fairly easy to compete against.
I don’t want to single out any site unfairly, so let me focus on what I think are poor search results, starting with my favourite niche: the makeup space.
One of the most popular sites in the industry is MakeupTalk, and here are some of their top rankings:
Typically, a forum would be the perfect destination for a question like “is ipsy worth it” as you can get lots of opinions in one place.
That said, there haven’t been any updates on the MakeupTalk thread in four years. That’s plenty of time for an offering to have improved or gotten worse.
Surely new content would be far more valuable to a searcher.
Though the second keyword example has been discussed a bit more recently (two years ago), the majority of feedback in the thread is from 2007.
The first response to whether shampoo expires is literally “Not sure…would see if there is a date.”
There are millions of forums out there so you really can use this for any niche.
Another option is to find out what is ranking well in specific subreddits.
The r/succulents subreddit helped me find this intent-focused keyword with a good amount of search traffic:
There are “only” 15 comments on the page, which is pretty low for Reddit. The first is asking the same question for Canada. The second is deleted.
I could comment on them all, but I’ll get to the point: not a single comment answers the searcher’s query. Everyone is just asking for the same thing.
Their Today I Learned section also ranks for some really interesting queries, no matter what niche you’re in:
With millions of subreddits, RedditList is a great resource for getting an idea of which sections of the site to look into.
As I was putting together this article, my friend Dan tweeted a great example of this in relation to local content.
He argues that when Yelp ranks well, you may be able to outrank them with actual content, rather than just a list of places offering the thing being searched for.
Which sites do you constantly see in search results that you don’t really think provide the best result?
I don’t mind going up against the big guns if I think I can produce something dramatically better.
On Quora especially, I constantly find search results where the first few answers are nothing but people promoting their own websites.
If you’re on our SEO secrets newsletter (opens a pop-up), you would have learned in secret #3 exactly how to extract the top Quora pages in any niche (and their exact view counts).
Bonus: Two More Advanced Tactics
A few weeks ago I went live with a brand new Detailed homepage where I’m more focused on building a newsletter (takes you to our homepage opt-in) around this website.
I’ve just gone live with secret #4, where I reveal two more advanced keyword research tactics I have never seen shared anywhere else.
One of them comes to us thanks to Kinsta, a 7-figure WordPress hosting company, who were kind enough to share actual traffic screenshots from a tactic that is bringing them paying customers.
It’s really smart, and it has been getting some great feedback.
If you have any questions or feedback, please do let me know in the comments!