GMGN Wallet Holdings Scraper

GMGN.ai Wallet Scraper: track crypto portfolios across Ethereum, BSC, Base, Solana, Blast, and Tron. Analyze token values, PnL metrics, and transaction history, and export filtered wallet data for trading strategies and investment insights. Supports multi-address scanning with real-time profit/loss analysis.

GMGN Wallet Holdings Scraper is a powerful web scraping tool built for the Apify platform. It automatically extracts and monitors token holdings from crypto wallets on GMGN.ai. Operating across multiple blockchain networks (Ethereum, BSC, Base, Solana, Blast, and Tron), it collects every crypto asset held in a given wallet along with its price, value, and profit/loss figures.

Why Should You Use GMGN Wallet Scraper?

GMGN Wallet Scraper automates manual data collection and gives you access to the most up-to-date wallet data. It offers the following advantages:

  • Time Saving: Saves you hours by automating manual data collection processes
  • Comprehensive Data: Complete data set including prices, transaction volumes, profit/loss, and total amount
  • Multi-Blockchain Support: Works on Ethereum, BSC, Base, Solana, Blast, and Tron networks
  • Customizable Data Collection: Ability to sort and filter according to your needs

Features

  • Extracts all token holdings from crypto wallets on GMGN.ai
  • Can filter cryptocurrencies based on various criteria (last transaction time, unrealized profit, total value, etc.)
  • Can scrape multiple wallet addresses at once
  • Stores collected data in Apify data storage and lets you export it in various formats (JSON, CSV, Excel); see the export sketch below
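
For example, once a run finishes you can pull the stored items straight from the Apify dataset API in whichever export format you need. A minimal sketch in Python, assuming the requests package is installed; DATASET_ID and APIFY_TOKEN are placeholders you replace with your own values:

import requests

# Placeholders -- replace with your own values (the dataset ID is shown on the run's Storage tab).
DATASET_ID = "YOUR_DATASET_ID"
APIFY_TOKEN = "YOUR_APIFY_TOKEN"

# The Apify dataset items endpoint supports several export formats, e.g. json, csv, xlsx.
url = f"https://api.apify.com/v2/datasets/{DATASET_ID}/items"
response = requests.get(url, params={"format": "csv", "token": APIFY_TOKEN})
response.raise_for_status()

with open("gmgn_holdings.csv", "wb") as f:
    f.write(response.content)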

Use Cases

  • Portfolio Tracking: Track your own or others' crypto portfolios
  • Investment Analysis: Analyze token values to identify potential investment opportunities
  • Market Research: Explore asset distribution in specific wallets
  • Data Science Projects: Create comprehensive datasets for crypto markets
  • Trading Strategies: Collect real-time data for your algorithm-based trading strategies

Usage

  1. Run this actor in the Apify console.
  2. Provide the necessary inputs:
    • walletAddresses: Wallet addresses to scan (you can enter multiple addresses)
    • chain: Blockchain network to scan (eth, bsc, base, sol, blast, tron)
    • limit: Number of tokens to scrape for each wallet
    • sortBy: Criterion for sorting results
    • sortDirection: Sort direction
    • includeSmallAmounts: Show tokens with small amounts
    • includeSoldTokens: Show completely sold tokens
    • includeRecentTx: Include transaction count in the last 30 days
    • excludeAbnormalTokens: Hide tokens with abnormal activity

Example Input

{
  "walletAddresses": ["0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045"],
  "chain": "eth",
  "limit": 50,
  "sortBy": "last_active_timestamp",
  "sortDirection": "desc",
  "includeSmallAmounts": true,
  "includeSoldTokens": true,
  "includeRecentTx": false,
  "excludeAbnormalTokens": false
}
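
If you prefer to run the actor programmatically instead of from the console, the same input can be passed through the Apify API client. A minimal sketch in Python, assuming the apify-client package is installed; the token and actor ID below are placeholders, so use the ID shown on this actor's page:

from apify_client import ApifyClient

client = ApifyClient("YOUR_APIFY_TOKEN")  # placeholder token

run_input = {
    "walletAddresses": ["0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045"],
    "chain": "eth",
    "limit": 50,
    "sortBy": "last_active_timestamp",
    "sortDirection": "desc",
    "includeSmallAmounts": True,
    "includeSoldTokens": True,
    "includeRecentTx": False,
    "excludeAbnormalTokens": False,
}

# "username/gmgn-wallet-holdings-scraper" is a placeholder actor ID.
run = client.actor("username/gmgn-wallet-holdings-scraper").call(run_input=run_input)

# Read the scraped holdings from the run's default dataset.
for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item["token"]["symbol"], item["usd_value"])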

Output

The collected data is saved to the Apify dataset. The output data includes the following fields:

  • wallet_address: Wallet address
  • chain: Blockchain network
  • token.address: Token address
  • token.symbol: Token symbol
  • token.name: Token name
  • token.decimals: Token decimals
  • token.logo: Logo URL
  • token.price_change_6h: 6-hour price change
  • token.is_show_alert: Whether an alert is shown
  • token.is_honeypot: Whether it's a honeypot
  • balance: Token balance
  • usd_value: USD value
  • price: Price
  • realized_profit_30d: 30-day realized profit
  • realized_profit: Realized profit
  • realized_pnl: Realized PnL
  • realized_pnl_30d: 30-day realized PnL
  • unrealized_profit: Unrealized profit
  • unrealized_pnl: Unrealized PnL
  • total_profit: Total profit
  • total_profit_pnl: Total profit PnL
  • avg_cost: Average cost
  • avg_sold: Average sold
  • buy_30d: 30-day purchases
  • sell_30d: 30-day sales
  • sells: Sales
  • cost: Cost
  • position_percent: Position percentage
  • last_active_timestamp: Last active time
  • history_sold_income: Historical sales income
  • history_bought_cost: Historical purchase cost
  • start_holding_at: Time started holding
  • end_holding_at: Time stopped holding
  • liquidity: Liquidity
  • total_supply: Total supply
  • wallet_token_tags: Wallet token tags

Example Output

{
  "token": {
    "address": "0x7e877b99897d514da01bd1d177e693ec639961af",
    "token_address": "0x7e877b99897d514da01bd1d177e693ec639961af",
    "symbol": "OGGY",
    "name": "Oggy Inu",
    "decimals": 9,
    "logo": "https://www.dextools.io/resources/tokens/logos/ether/0x7e877b99897d514da01bd1d177e693ec639961af.png",
    "price_change_6h": "253927182291795.3311568439751095",
    "is_show_alert": false,
    "is_honeypot": false
  },
  "balance": "8000000",
  "usd_value": "19086103437198.844",
  "realized_profit_30d": "0",
  "realized_profit": "0",
  "realized_pnl": "0",
  "realized_pnl_30d": "0",
  "unrealized_profit": "0",
  "unrealized_pnl": "0",
  "total_profit": "0",
  "total_profit_pnl": "0",
  "avg_cost": "0",
  "avg_sold": "0",
  "buy_30d": 0,
  "sell_30d": 0,
  "sells": 0,
  "price": "2385762.9296498555",
  "cost": "0",
  "position_percent": "1",
  "last_active_timestamp": 0,
  "history_sold_income": "0",
  "history_bought_cost": "0",
  "start_holding_at": null,
  "end_holding_at": null,
  "liquidity": "6129.82303856382",
  "total_supply": "420000000000",
  "wallet_token_tags": null,
  "wallet_address": "0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045",
  "chain": "eth"
}

This example output shows the structured record for a single token holding. The actual output is a list of similar objects, one for each token in every processed wallet.
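
Note that numeric fields such as usd_value and total_profit come back as strings (as in the example above), so you will usually parse them before aggregating. A short sketch that sums the USD value held per wallet, assuming the items were exported to a local JSON file (the file name is a placeholder):

import json
from collections import defaultdict

# Placeholder file name -- e.g. a JSON export of the dataset items.
with open("gmgn_holdings.json") as f:
    items = json.load(f)

totals = defaultdict(float)
for item in items:
    # usd_value is a string in the output, so convert before summing.
    totals[item["wallet_address"]] += float(item["usd_value"])

for wallet, usd in totals.items():
    print(f"{wallet}: ${usd:,.2f}")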

Notes

  • The collected data is stored in the run's default Apify dataset.

Frequently Asked Questions

Is it legal to scrape public wallet data?

Yes, if you're scraping publicly available data for personal or internal use. Always review GMGN.ai's Terms of Service before large-scale use or redistribution.

Do I need to code to use this scraper?

No. This is a no-code tool: just enter the wallet addresses and chain, then run the scraper directly from your dashboard or the Apify actor page.

What data does it extract?

It extracts every token held in a wallet, including balances, prices, USD values, realized and unrealized profit/loss, average cost, and recent buy/sell counts. You can export all of it to JSON, CSV, or Excel.

Can I scan multiple wallets or filter the results?

Yes, you can scan multiple wallet addresses in a single run and sort or filter the results by last transaction time, unrealized profit, total value, and more, depending on the input settings you use.

How do I get started?

You can use the Try Now button on this page to go to the scraper. You'll be guided to enter wallet addresses and a chain and will get structured results. No setup needed!