Scraping my own list and GSA Threads
2014-07-08, 08:05 PM,
(This post was last modified: 2014-07-08, 08:07 PM by luckskywalker.)
#1
Hey guys, sorry for so many questions :P

1.
I have GScraper and Scrapebox. Which do you prefer for scraping your list for GSA?

Is it possible to "combine" them and use them together to create a better list? (Rough sketch of what I mean below.)

2.
Do you set your GSA threads per proxy or per machine? I have a dedicated server with 16GB RAM, so that's not a problem, but I only have 10 semi-dedicated proxies from buyproxies.
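
What I mean by combining is something like this rough sketch (assuming both tools export plain one-URL-per-line .txt files; the filenames here are made up):

Code:
# Merge GScraper and Scrapebox exports into one deduplicated list.
# Assumes both tools exported plain text files, one URL per line;
# the filenames are hypothetical placeholders.
from urllib.parse import urlparse

merged = set()
for path in ("gscraper_urls.txt", "scrapebox_urls.txt"):
    with open(path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            url = line.strip()
            if url.startswith("http"):
                merged.add(url)

# Optional: keep one URL per domain before feeding the list to GSA SER.
seen_domains = set()
unique = []
for url in sorted(merged):
    domain = urlparse(url).netloc.lower()
    if domain not in seen_domains:
        seen_domains.add(domain)
        unique.append(url)

with open("combined_list.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(unique))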
Reply
2014-07-08, 08:08 PM,
(This post was last modified: 2014-07-08, 08:09 PM by Krrish.)
#2
I already told you: Gscraper... Is that not enough? :p

You can run 300 threads easily with that kind of machine. Make sure the timeout is set to 150-180 seconds.

Reply
2014-07-08, 08:27 PM,
(This post was last modified: 2014-07-08, 08:27 PM by amar125.)
#3
The more proxies and the better your specs, the more threads you can run.
From the proxy point of view, the recommended ratio is:
1 proxy >> 10 threads

For specs, you need to test and see how many threads your VPS can handle without lagging, etc.
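
As a rough back-of-the-envelope sketch of that rule (the 300 machine limit is just the figure from post #2, not an official GSA number):

Code:
# Rough thread-count heuristic from the "1 proxy >> 10 threads" rule of thumb.
proxies = 10               # semi-dedicated proxies (from post #1)
threads_per_proxy = 10     # the ratio above
machine_limit = 300        # what the hardware could run (post #2's figure)

recommended = min(proxies * threads_per_proxy, machine_limit)
print(f"Recommended threads: {recommended}")  # -> 100; proxies are the bottleneck

So with 10 proxies, the proxy side caps you at about 100 threads, even though the machine could handle 300.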
Reply
2014-07-08, 08:31 PM,
#4
(2014-07-08, 08:27 PM)amar125 Wrote: The more proxies and the better your specs, the more threads you can run.
From the proxy point of view, the recommended ratio is:
1 proxy >> 10 threads

For specs, you need to test and see how many threads your VPS can handle without lagging, etc.

Hm, I will need to invest more in proxies then :P

Thanks
Reply
2014-07-10, 04:22 AM,
#5
Scraping is all about proxies. Public, not private.

FOR THE LOVE OF ALL THAT IS HOLY! DON'T SCRAPE WITH PRIVATE PROXIES!!

Notice how Gscraper sells proxies for ONE-TIME use. There's a reason for that. If you disable the multi-threaded harvester in Scrapebox and look at the individual results with private proxies, chances are you'll see a lot of server error codes. Also, the more specific your footprints are for scraping, the less Google will allow you to search.

Check out the videos by loopline. He explains it better than I can. What I recommend is to use the proxy finder in GSA and save the results to a file. Then import those into Scrapebox and test them. You will find some anonymous Google-passed proxies. It's better than buying proxies and burning them out.

Trust me, I just spent the last week figuring this out for myself. Countless emails between me and Scrapebox support.
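
If you'd rather script the testing step than do it in Scrapebox, here's a minimal sketch (assuming the proxy finder saved one ip:port per line to proxies.txt; this only checks that a proxy answers a Google query, not full anonymity):

Code:
# Filter harvested public proxies: keep the ones that answer a Google search.
# proxies.txt is assumed to hold one ip:port per line (the exported file).
import concurrent.futures
import requests

TEST_URL = "https://www.google.com/search?q=test"
TIMEOUT = 10  # seconds per proxy

def check(proxy):
    """Return the proxy if Google answers with HTTP 200 through it."""
    cfg = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        r = requests.get(TEST_URL, proxies=cfg, timeout=TIMEOUT,
                         headers={"User-Agent": "Mozilla/5.0"})
        return proxy if r.status_code == 200 else None
    except requests.RequestException:
        return None  # dead, banned, or timed out

with open("proxies.txt") as f:
    candidates = [line.strip() for line in f if line.strip()]

with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
    working = [p for p in pool.map(check, candidates) if p]

with open("google_passed.txt", "w") as out:
    out.write("\n".join(working))
print(f"{len(working)}/{len(candidates)} proxies passed")

Public proxies die fast, so rerun this right before each scrape.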
Reply
2014-07-10, 04:56 AM,
#6
(2014-07-10, 04:22 AM)Silverdot Wrote: Scraping is all about proxies. Public, not private.

FOR THE LOVE OF ALL THAT IS HOLY! DON'T SCRAPE WITH PRIVATE PROXIES!!

Notice how Gscraper sells proxies for ONE-TIME use. There's a reason for that. If you disable the multi-threaded harvester in Scrapebox and look at the individual results with private proxies, chances are you'll see a lot of server error codes. Also, the more specific your footprints are for scraping, the less Google will allow you to search.

Check out the videos by loopline. He explains it better than I can. What I recommend is to use the proxy finder in GSA and save the results to a file. Then import those into Scrapebox and test them. You will find some anonymous Google-passed proxies. It's better than buying proxies and burning them out.

Trust me, I just spent the last week figuring this out for myself. Countless emails between me and Scrapebox support.

But even with low connections, isn't it better to use the private ones?
Reply
2014-07-10, 05:52 AM,
(This post was last modified: 2014-07-10, 05:53 AM by Silverdot.)
#7
(2014-07-10, 04:56 AM)Neros Wrote: But even with low connections, isn't it better to use the private ones?

You missed this part:

" Also the more specific your footprints are for scraping the less Google with allow you to search. "

No. No it's not.

If you want to fight me on it and do your own thing, then go ahead. But when you start seeing 402 error codes and your scrapes stop returning any URLs, feel free to come back to this thread and tell me I was right.
Reply
2014-07-10, 06:01 AM,
#8
(2014-07-10, 05:52 AM)Silverdot Wrote: You missed this part:

"Also, the more specific your footprints are for scraping, the less Google will allow you to search."

No. No it's not.

If you want to fight me on it and do your own thing, then go ahead. But when you start seeing 402 error codes and your scrapes stop returning any URLs, feel free to come back to this thread and tell me I was right.

I'm just asking, calm down lol
Reply

