Find residential proxy sessions on Apify Proxy whose IP addresses are geolocated in specific postal codes or DMAs.

This actor finds residential IP addresses on Apify Proxy that are geolocated in specific postal codes or DMA (Designated Market Area) regions.
The actor probes random sessions on Apify Proxy's RESIDENTIAL proxy group and uses IP geolocation to check whether the corresponding residential IP address belongs to a certain postal code or DMA in a specific country. If it does, the actor saves the session key and then performs periodic requests on that session to keep it alive. Therefore, the actor needs to run indefinitely, or for as long as you need the proxies.
Yes, this actor is a hack.
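The probe-and-collect loop described above can be sketched as follows. This is an illustrative sketch, not the actor's actual code: `geolocate` stands in for the real HTTP lookup made through the proxy session, and the function names are assumptions.

```python
import random
import string

def random_session_key(length: int = 9) -> str:
    """Generate a random session key to probe on Apify Proxy."""
    return "".join(random.choices(string.hexdigits.lower(), k=length))

def find_matching_sessions(geolocate, target_postal_codes, num_probes=100):
    """Probe random sessions and keep those whose IP geolocates to one of
    the target postal codes.

    `geolocate` maps a session key to a geolocation dict (e.g. with a
    "postalCode" field) or None on failure; in the real actor this would
    be an HTTP request routed through the proxy session.
    """
    pool = {}
    for _ in range(num_probes):
        key = random_session_key()
        info = geolocate(key)
        if info and info.get("postalCode") in target_postal_codes:
            pool[key] = info  # matched: remember the session key
    return pool
```

In the real actor, each session in the pool would then be refreshed periodically (the `refreshes*` statistics below), and sessions whose IP changes out of the target area would be forgotten.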
The pool of residential proxy sessions is periodically stored as a JSON record in a key-value store (either a named or an anonymous one), together with various statistics. The record looks as follows:
```json
{
  "stats": {
    "probesTotal": 1290,
    "probesMatched": 672,
    "probesDmaMismatch": 409,
    "probesDmaNotFound": 86,
    "refreshesTotal": 4688,
    "refreshesIpSame": 4197,
    "forgotten": 289,
    "probesFailed": 3,
    "refreshesFailed": 16,
    "refreshesIpChanged": 319,
    "probesNoPostalCode": 25
  },
  "proxySessions": {
    "596452102": {
      "ipAddress": "1.2.3.4",
      "countryCode": "US",
      "regionName": "New York",
      "city": "Yonkers",
      "postalCode": "10701",
      "dmaCode": "501",
      "foundAt": "2019-09-11T11:32:47.727Z",
      "lastCheckedAt": "2019-09-11T11:33:27.487Z"
    },
    "dbc0a42d7": {
      "ipAddress": "4.5.6.7",
      "countryCode": "US",
      "regionName": "Maryland",
      "city": "Severn",
      "postalCode": "21144",
      "dmaCode": "512",
      "foundAt": "2019-09-11T11:32:08.278Z",
      "lastCheckedAt": "2019-09-11T11:33:27.325Z"
    },
    ...
  }
}
```
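To use a saved session, encode its key in the Apify Proxy username. A minimal sketch, assuming the standard Apify Proxy URL format (`groups-RESIDENTIAL,session-<key>` as the username, `proxy.apify.com:8000` as the host) and a trimmed-down version of the record above:

```python
def proxy_url_for_session(session_key: str, password: str) -> str:
    """Build an Apify Proxy URL that pins the given residential session."""
    return (f"http://groups-RESIDENTIAL,session-{session_key}"
            f":{password}@proxy.apify.com:8000")

# A trimmed-down example of the stored record, as loaded from the
# key-value store; only the fields used below are shown.
record = {
    "proxySessions": {
        "596452102": {"ipAddress": "1.2.3.4", "dmaCode": "501"},
        "dbc0a42d7": {"ipAddress": "4.5.6.7", "dmaCode": "512"},
    }
}

# Pick all sessions geolocated in DMA 501 and build proxy URLs for them.
urls = [
    proxy_url_for_session(key, "<your proxy password>")
    for key, info in record["proxySessions"].items()
    if info.get("dmaCode") == "501"
]
```

The resulting URLs can then be passed to any HTTP client as its proxy setting; keep in mind that a session only stays alive while the actor keeps refreshing it.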