How to find high PA expired web 2.0s

SEO has been changing over the years, and new techniques keep replacing the old ones. Still, some things continue to work, and one of them is building links from quality expired web 2.0s.
Expired web 2.0s that have already established high authority can pass link juice to your money site when you get a backlink from them. When you don't have a big budget, you can always scrape these yourself. In this post, I will discuss how you can find expired web 2.0s with Scrapebox and with other tools like Domain Hunter Gatherer.

Some web 2.0 platforms don't allow you to register names that have already been used, but Tumblr does release expired usernames, so you can easily register an expired Tumblr blog after scraping. Once you register an expired web 2.0, you can fill it with quality content, link to your website, and get a high PA link to your blog!

Let’s see how damn easy the scraping process can be. 😀

Using Scrapebox:

Scrapebox is one of the best tools I have come across in the field of SEO. It is a must-have tool that comes in handy in a lot of ways. First, fire up your Scrapebox console. You should see this screen:

Scrape Box Console
Follow the below steps to get going:

  1. Add custom footprint:

The first step is to add custom footprints for what you need to scrape. Suppose I want to scrape expired web 2.0s from Tumblr, OverBlog, Webs, etc. The footprints would look like this:

site:tumblr.com
site:overblog.com
site:webs.com

You can add other web 2.0 platforms in the same format when you need them.

Custom Footprint
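Under the hood, combining footprints with keywords is just a cross product: every footprint paired with every keyword becomes one search query. If you ever want to prepare such a list outside Scrapebox, a minimal Python sketch (the keywords below are made-up examples) looks like this:

```python
# Combine each site: footprint with each keyword to form search queries.
# The footprints and keywords below are just illustrative examples.
footprints = ["site:tumblr.com", "site:overblog.com", "site:webs.com"]
keywords = ["dog training", "weight loss"]

queries = [f"{fp} {kw}" for fp in footprints for kw in keywords]
for q in queries:
    print(q)
```

Each printed line is one query the harvester would send to the search engine.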

 

  2. Scrape the keywords:

Add your list of main keywords to the keyword console and hit "Scrape". Once a scrape is complete, you can "Clear left", "Transfer left", and click "Scrape" again to get a bigger list. You can keep scraping until you reach the number you want. Once you are done, "Remove duplicates" and click "Send to Scrapebox". Now you should have your big list of keywords ready to use for harvesting. The bigger your keyword list, the more URLs you can harvest.

 

Scrape Keywords 

keyword scrape button
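If you are curious where those extra keywords come from, keyword scrapers typically pull from autocomplete sources such as Google Suggest. A rough sketch of querying that source follows; note the endpoint is undocumented, so the URL and response shape are assumptions that may change or get rate-limited:

```python
import json
import urllib.parse
import urllib.request

SUGGEST = "https://suggestqueries.google.com/complete/search"

def suggest_url(keyword):
    # Build a Google Suggest request URL; the endpoint is undocumented
    # and may change or block you without notice.
    return SUGGEST + "?" + urllib.parse.urlencode(
        {"client": "firefox", "q": keyword})

def expand(keyword):
    # The response is a JSON array; index 1 holds the suggestion list.
    with urllib.request.urlopen(suggest_url(keyword), timeout=10) as resp:
        return json.load(resp)[1]
```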

 

  3. Load proxies:

For scraping, you need to use private proxies. You can buy some at Buyproxies. I have been using them for some time; their support is very good and I have never had any issues. I suggest starting with 10-20 proxies.

proxies

  4. Harvest URLs:

Once you are done with the list of keywords, you can start harvesting URLs. Click on "Start harvesting" as shown below:

start harvesting

 


  5. Process scraped URLs:

Once you have scraped your list of URLs, you will need to "Trim to root" and "Remove duplicates" to get the final list as shown below:

Trim to root

 

Remove duplicate URLs 
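Both of those buttons are simple string operations. For reference, this is roughly what "Trim to root" plus "Remove duplicates" does, sketched in Python with made-up example URLs:

```python
from urllib.parse import urlsplit

def trim_to_root(url):
    # Reduce a harvested URL to its root: scheme://host/
    parts = urlsplit(url)
    return f"{parts.scheme}://{parts.netloc}/"

harvested = [
    "http://example.tumblr.com/post/123/some-title",
    "http://example.tumblr.com/archive",
    "http://another.tumblr.com/page/2",
]
# dict.fromkeys removes duplicates while keeping the original order
roots = list(dict.fromkeys(trim_to_root(u) for u in harvested))
print(roots)
```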

 

  6. Check availability:

Once the scraped URL list is ready, you need to check the availability of the URLs with the "Scrapebox Vanity Name Checker" addon. Click on "Scrapebox Vanity Name Checker" and the console will pop up. Click the dropdown beside "Load", then "URLs from Scrapebox", and hit "Start/Retry". The name checker will check the availability of all the expired web 2.0s. After the check is complete, click on "Remove taken". You can rerun the check multiple times to be confident about the availability of your final list.

If you scrape more than 25,000 URLs, you can easily end up with around 200-300 available names, but you will need more proxies or you may be locked out of Google. Adjust the Scrapebox connections to 40-60. Such a huge scrape can substantially slow your machine, so it's best to procure a VPS if you can afford one.

Vanity checker option
Vanity checker
Vanity checker remove taken
Vanity checker export to harvester
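The addon does the heavy lifting, but the basic idea behind an availability probe is easy to sketch. The heuristic below is an assumption on my part: a 404 on a Tumblr subdomain usually, but not always, means the name is free, since the platform can still reserve or ban names. Always confirm with the Vanity Name Checker or on tumblr.com itself.

```python
import urllib.error
import urllib.request
from urllib.parse import urlsplit

def vanity_name(url):
    # 'http://myblog.tumblr.com/' -> 'myblog'
    return urlsplit(url).netloc.split(".")[0]

def looks_available(url):
    # Rough first-pass probe only: a 404 often means the subdomain is
    # unregistered, but the platform may still reserve the name.
    try:
        urllib.request.urlopen(url, timeout=10)
        return False  # the page loads, so the name is taken
    except urllib.error.HTTPError as err:
        return err.code == 404
    except urllib.error.URLError:
        return None  # network problem, result unknown

print(vanity_name("http://myblog.tumblr.com/"))  # prints "myblog"
```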

 

Now, you can export the list as a .txt file and save it for further use.

Export harvested links

 

Scraping process 2 (Domain Hunter Gatherer):

This is a paid tool that you can use to find expired web 2.0s if you have the budget. It's pretty easy to use. You just need a big list of keywords, which you can generate using keywordtool.io.

Once you are ready with your list, select "Copied keywords from clipboard" and you will get your list of expired web 2.0s after the scrape. The tool checks everything and gives you the final list, so you save a lot of manual work; it replaces Steps 1 to 6 above.

 

  7. Find what you need (additional bonus tip):

Now, run all the available domains through a bulk PA checker and keep the ones with a minimum PA of 20+. Though Google doesn't update PR anymore, you could also check PR: good PR metrics on an expired web 2.0 are an indicator of old authority, and a list of PR2+ sites can be worth registering. Don't get excited just because the DA is high; since these are subdomains of big web 2.0 platforms, the high DA belongs to the platform, not to the individual blog.

  8. Manual check (spam check):

This is the last step and it can be crucial. Many people ignore it, but it is quite important: you need to spam check the URLs. Some expired ones may have been spammed to death previously, and you can put your money site at risk by linking from them. To spam check, use tools like Open Site Explorer or Ahrefs. In Open Site Explorer, search for the backlinks of the URL; if you are lucky, you can find one with a backlink from an authority site. If not, pick the ones that are not spammed. Domains with adult or casino links should be avoided.

If you want to find web 2.0s in your niche, check them at the Web Archive and see how they were used when they were previously active. As web 2.0s don't cost anything to register, this is a very cheap way to rank a site. Once you have the final list, go to the particular web 2.0 platform and register each name.
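The Web Archive lookup can also be scripted through the Internet Archive's public Wayback "availability" endpoint. A minimal sketch follows, assuming the JSON shape the API currently returns:

```python
import json
import urllib.parse
import urllib.request

WAYBACK_API = "https://archive.org/wayback/available"

def wayback_query(domain):
    # Build the availability API request URL for a domain.
    return WAYBACK_API + "?" + urllib.parse.urlencode({"url": domain})

def last_snapshot(domain):
    # Return the URL of the closest archived snapshot, or None if the
    # domain was never captured by the Wayback Machine.
    with urllib.request.urlopen(wayback_query(domain), timeout=10) as resp:
        data = json.load(resp)
    snap = data.get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap else None
```

If `last_snapshot` returns a URL, open it to see what the blog looked like before it expired and whether it matches your niche.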

  9. Adding content:

Once you register the web 2.0, build it like your own money site or blog. Add profile pictures and headers wherever necessary, and start adding quality content. Write it yourself or buy high quality content from reputed providers. Articles can be spun, but they must stay readable and should be edited manually; adding copied content won't help you rank. The articles should be in the same niche as your website, with interesting, catchy titles. I will come up with a post on how to optimize an article for SEO.

Thus, expired web 2.0s can be a great source for ranking sites, and after reading this post, you should be able to find them. I would love to know your feedback in the comments section 🙂

Amit Acharya

Hello, I am the author of Top Earning Strategies. I am an avid blogger and a digital marketer. I love to learn and write about making money online, SEO, Blogging and any new tips and tricks.