Ethereum Account Transaction API

Retrieve detailed Ethereum transaction history for any wallet address, including sender, receiver, amount, status, and fees. Supports pagination for large datasets. Ideal for investors, analysts, and developers tracking ETH transactions.

Ethereum Account Transactions API

Features

  • Scrapes Transaction Data: Collects on-chain transaction details for a specified Ethereum address.
  • Pagination Support: Handles multiple pages of transaction data with customizable page size and start page.
  • Error Handling: Manages exceptions and logs errors during the scraping process.
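
The pagination behavior above can be sketched as a small helper. This is an illustrative assumption, not the actor's actual code: `PAGE_SIZE` is a hypothetical page size, since the real per-page count is not documented here.

```python
# Sketch of how startPage and maxItems could map to page requests.
# PAGE_SIZE is an assumption; the actor's real page size is not documented.
PAGE_SIZE = 50

def pages_to_fetch(start_page: int, max_items: int) -> list[int]:
    """Return the page numbers needed to collect up to max_items records."""
    n_pages = -(-max_items // PAGE_SIZE)  # ceiling division
    return list(range(start_page, start_page + n_pages))
```

For example, requesting 120 items starting at page 2 would touch pages 2 through 4 under this assumed page size.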

Example Usage

Here's how you might set up your actor:

```json
{
  "address": "0xYourEthereumAddressHere", // Must be a valid Ethereum address.
  "maxItems": 500,                        // Must be between 1 and 2000.
  "startPage": 2                          // Must be between 1 and 1000.
}
```
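
The documented input constraints can be checked before a run. This is a hedged sketch mirroring the rules stated above, not the actor's actual validation code:

```python
import re

# Hypothetical validator mirroring the documented input constraints;
# the actor's real validation logic is not shown in this README.
ETH_ADDRESS_RE = re.compile(r"^0x[0-9a-fA-F]{40}$")

class InvalidInput(ValueError):
    """Raised when actor input violates the documented constraints."""

def validate_input(run_input: dict) -> dict:
    address = run_input.get("address", "")
    if not ETH_ADDRESS_RE.match(address):
        raise InvalidInput("address must be a 0x-prefixed, 40-hex-character Ethereum address")
    max_items = run_input.get("maxItems", 500)
    if not 1 <= max_items <= 2000:
        raise InvalidInput("maxItems must be between 1 and 2000")
    start_page = run_input.get("startPage", 1)
    if not 1 <= start_page <= 1000:
        raise InvalidInput("startPage must be between 1 and 1000")
    return {"address": address, "maxItems": max_items, "startPage": start_page}
```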

Example Output

```json
[
    {
        "tx_hash": "0xea8aa9285cbf2e116d52026040002fb95290571817cb44a81b5727e245b839d8",
        "status": "Success",
        "method": "Transfer",
        "blockno": "21730011",
        "date_time": "2025-01-29 11:53:11",
        "sender": "0xa83114a443da1cecefc50368531cace9f37fcccb",
        "sender_lable": "",
        "receiver": "0x388c818ca8b9251b393131c08a736a67ccb19297",
        "receiver_lable": "Lido: Execution Layer Rewards Vault",
        "amount": "0.026817214 ETH",
        "value": "$73.25",
        "txn_fee": "0.00005112"
    }
]
```
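
Since `amount` and `txn_fee` are returned as strings, downstream analysis usually starts by converting them to numbers. A minimal sketch, assuming the string formats shown in the sample record above:

```python
# Hedged post-processing helpers for the example output; field names follow
# the sample record, and the numeric string formats are assumed from it.
def parse_eth_amount(amount: str) -> float:
    """Convert a string like '0.026817214 ETH' to a float of ether."""
    return float(amount.split()[0])

def total_fees(records: list[dict]) -> float:
    """Sum the txn_fee fields (given in ETH) across a dataset export."""
    return sum(float(r["txn_fee"]) for r in records)
```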

Limitations

  • Processes at most 2,000 transactions per run to ensure reliable performance.
  • Some transactions may have fields returned as "N/A" when the data is not available.

Error Handling

  • The actor throws an InvalidInput exception if the inputs do not meet the specified criteria.
  • Errors during scraping are logged, and the actor attempts to continue with the next page or transaction.
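
The continue-on-error behavior described above can be illustrated with a short skeleton. This is a sketch, not the actor's actual code; `fetch` is a hypothetical callable standing in for the per-page scraping step:

```python
import logging

logger = logging.getLogger("eth-tx-scraper")

def scrape_pages(pages, fetch):
    """Illustrative skeleton of continue-on-error scraping: a failure on
    one page is logged and the run moves on to the next page.
    `fetch` is a hypothetical callable returning a list of transactions."""
    results = []
    for page in pages:
        try:
            results.extend(fetch(page))
        except Exception as exc:
            logger.error("page %s failed: %s; continuing", page, exc)
    return results
```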

Support

For issues, feature requests, or questions about this actor, please create an issue.

Frequently Asked Questions

Is it legal to scrape public blockchain data?

Yes, if you're collecting publicly available data for personal or internal use. Always review the source website's Terms of Service before large-scale use or redistribution.

Do I need to code to use this scraper?

No. This is a no-code tool: just enter an Ethereum address and run the actor directly from your dashboard or the Apify actor page.

What data does it extract?

It extracts the transaction hash, status, method, block number, timestamp, sender and receiver addresses (with known labels where available), ETH amount, USD value, and transaction fee. You can export all of it to Excel or JSON.

Can I scrape multiple pages or limit the results?

Yes. You can scrape multiple pages and control the result size with the startPage and maxItems input settings.

How do I get started?

You can use the Try Now button on this page to go to the scraper. You’ll be guided to input a search term and get structured results. No setup needed!