
Migrating csv from cointracking and importing csv exported from kraken #289

PanosChtz opened this issue Mar 26, 2025 · 7 comments

@PanosChtz

I was using cointracking for the last 3 years. I exported the CSV from cointracking; here is a sample:

"Type","Buy","Cur.","Sell","Cur.","Fee","Cur.","Exchange","Group","Comment","Date","Tx-ID"
"Deposit","0.00441641","BTC","","","0.00000000","","Gemini","","","2025-01-26 15:22:08","282881076769"
"Withdrawal","","","1.03987000","LTC","0.00001254","LTC","Gemini","","","2025-01-26 14:39:07","282876626620"
"Withdrawal","","","0.08623000","LTC","0.00001254","LTC","Gemini","","","2025-01-21 04:13:20","280917939621"
"Other Income","0.00032926","BTC","","","0.00000000","BTC","Gemini","","","2024-08-23 03:10:27","237856380292"
"Other Income","0.00033326","BTC","","","0.00000000","BTC","Gemini","","","2024-08-22 03:10:26","237714027863"
"Other Income","0.00033463","BTC","","","0.00000000","BTC","Gemini","","","2024-08-21 03:10:26","237532794776"
"Other Income","0.00034071","BTC","","","0.00000000","BTC","Gemini","","","2024-08-20 03:10:27","237369199255"

I now want to migrate from cointracking to rp2. For the past year I only used Kraken to trade. Here is a sample of the exported CSV for 2024:

"txid","refid","time","type","subtype","aclass","asset","wallet","amount","fee","balance"
"LFTGXD-RWVPJ-J645GJ","TA4RTP-4TUW5-G5DGTA","2024-03-19 19:43:31","trade","","currency","BTC","spot / main",-0.0012570000,0,0.0025137244
"LS4N2A-5DK5R-DRB7ZL","TA4RTP-4TUW5-G5BGDA","2024-03-19 19:43:31","trade","","currency","LTC","spot / main",1.0000000000,0,1.0772481500
"LEMCZY-4QN7D-AHHLOF","FTS9z1b-zVNdUE1trYlXT8OWPvPcWf","2024-03-19 19:46:11","withdrawal","","currency","LTC","spot / main",-0.9212000000,0.0020000000,0.1540481500
"LPXKOL-ZXZJW-435RQ4","TB7WK4-ITPP3-LWGEI4","2024-04-08 16:49:02","trade","","currency","BTC","spot / main",-0.0021789300,0,0.0003347944
"LZSCKS-ULSZZ-JUNNOQ","FTGrDWz-T2pDQS2usl6ffwJOZ8zcjV","2024-12-01 20:07:00","deposit","","currency","BTC","spot / main",0.0226436100,0,0.0297701154

Since the Kraken CSV is much smaller, I tried to work with that first. Here is the .ini file:

[dali.plugin.input.csv.manual]
in_csv_file = krakensample.csv

decimal_separator = .
date_format = %Y-%m-%d %H:%M:%S
timezone = UTC

# Map Kraken columns to DaLI fields
column_map.unique_id = txid
column_map.timestamp = time
column_map.transaction_type = type
column_map.asset = asset
column_map.crypto_in = amount
column_map.crypto_fee = fee
column_map.exchange = wallet

# Provide placeholders for missing columns
column_map.spot_price = __unknown
column_map.fiat_in_no_fee = __unknown
column_map.fiat_in_with_fee = __unknown
column_map.fiat_fee = __unknown
column_map.fiat_ticker = __unknown
column_map.notes = __unknown

# Map Kraken operations to DaLI transaction types
operation_map.trade = TRADE
operation_map.withdrawal = WITHDRAWAL
operation_map.deposit = DEPOSIT
operation_map.fee = FEE

# General Configuration
country = us

However, when I execute it, I get the following error:

$ dali_us -s -o output -p test_ test_config.ini
INFO: Country: us
INFO: Initialized input plugin 'dali.plugin.input.csv.manual'
INFO: No pair converter plugins found in configuration file: using default pair converters.
INFO: Reading crypto data using plugin 'dali.plugin.input.csv.manual'
ERROR: Fatal exception occurred:
Traceback (most recent call last):
  File "/home/user/.local/lib/python3.10/site-packages/dali/dali_main.py", line 168, in _dali_main_internal
    result_list = pool.map(_input_plugin_helper, input_plugin_args_list)
  File "/usr/lib/python3.10/multiprocessing/pool.py", line 367, in map
    return self._map_async(func, iterable, mapstar, chunksize).get()
  File "/usr/lib/python3.10/multiprocessing/pool.py", line 774, in get
    raise self._value
  File "/usr/lib/python3.10/multiprocessing/pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
  File "/usr/lib/python3.10/multiprocessing/pool.py", line 48, in mapstar
    return list(map(*args))
  File "/home/user/.local/lib/python3.10/site-packages/dali/dali_main.py", line 226, in _input_plugin_helper
    plugin_transactions = input_plugin.load(country)
  File "/home/user/.local/lib/python3.10/site-packages/dali/plugin/input/csv/manual.py", line 99, in load
    self._load_in_file(result)
  File "/home/user/.local/lib/python3.10/site-packages/dali/plugin/input/csv/manual.py", line 117, in _load_in_file
    raise ValueError(f"Not enough columns: the {self.__in_csv_file} CSV must contain {self.__IN_NOTES_INDEX} columns.")
ValueError: Not enough columns: the krakensample.csv CSV must contain 12 columns.
INFO: Log file: ./log/rp2_2025_03_26_02_32_40_160702.log
INFO: Generated output directory: output
INFO: Done

What does this error mean? From the documentation, it is not clear if/how I can use CSVs exported from exchanges or other sources.
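(For context on the ValueError: the Kraken ledger header above has 11 columns, one fewer than the 12 the manual plugin's in-file requires, so the file is rejected before any column mapping is applied. A quick check, just illustrating the count:)

```python
import csv
import io

# Header row copied from the Kraken ledger export shown above.
kraken_header = ('"txid","refid","time","type","subtype","aclass",'
                 '"asset","wallet","amount","fee","balance"')

columns = next(csv.reader(io.StringIO(kraken_header)))
print(len(columns))  # 11 -- fewer than the 12 the manual plugin's in-file expects
```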

@eprbell
Owner

eprbell commented Mar 26, 2025

There are a few problems with your approach:

  • you're trying to load a Kraken-generated file with the manual plugin: it's unlikely that the Kraken format maps 1-1 to the manual plugin format (even after reshuffling column positions). The correct approach would be to write a Kraken data loader plugin for DaLI, and then let DaLI generate the correct format for RP2.
  • the part of your configuration with column_map.unique_id, etc. is not in the correct configuration format for the Manual plugin: read its documentation to understand the correct format.

@PanosChtz
Author

The reason I'm not using the Kraken API key approach is my previous transactions, which go all the way back to 2013 and are in the CSV exported from cointracking.info. I suspect this would create a mess, because many transactions would be imported both from the cointracking CSV file and from the Kraken API and would become duplicates (I'm not sure whether these are detected or filtered out).
I read the documentation, but it is not clear whether the CSV needs to have a specific structure. For example, do I need to break it down into 3 CSVs (in, out, intra)? What are the minimum expected fields? (The error message says "must contain 12 columns": what are these columns?)

Also, what about the cointracking CSV? It contains about 11 years of transaction history, including exchanges which are now out of business, local wallets, etc.

@eprbell
Owner

eprbell commented Mar 26, 2025

The manual plugin documentation explains that 3 CSV files are required (just read the first paragraph): these CSV files must be referenced in the [dali.plugin.input.csv.manual <qualifiers>] section of the .ini file. The format of each of these 3 files is described immediately below that.
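(A hedged sketch of that section: the `in_csv_file` option name appears in the original config and in the traceback, while `out_csv_file` and `intra_csv_file` are assumptions to be verified against the manual plugin documentation:)

```ini
[dali.plugin.input.csv.manual]
# One file per transaction category. The option names for the out and intra
# files are assumed here -- confirm them in the manual plugin documentation.
in_csv_file = manual_in.csv
out_csv_file = manual_out.csv
intra_csv_file = manual_intra.csv
```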

Kraken, Cointracking, or any other source of transactions, whether REST or CSV, requires a DaLI data loader plugin to normalize the data into RP2 format.

As for the question of how to switch from another software to RP2, check the relevant FAQ. If you're using DaLI to generate RP2 input and you have used other software in previous years, you may have to remove the transactions that have been already handled by other software from the generated RP2 input.

@PanosChtz
Author

Ok, but I'm still confused about what I need to do with the 2 CSV files I have. Should I write a program myself to first break each one down into 3 CSVs (in, out, intra) and rearrange the columns for each, as in the test_manual_x.csv files in https://github.com/eprbell/dali-rp2/tree/main/input ?

@eprbell
Owner

eprbell commented Mar 26, 2025

You have two options:

  • use the manual CSV plugin: take your transactions (in whatever format they are) and convert them to the manual CSV format. You could do this either manually or with a conversion script.
  • write one (or more) data loader plugin, which understands the native format data and transforms it into DaLI transactions in memory.

Then DaLI will take care of everything else: normalize the data, call the transaction resolver, generate RP2 output.
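(The first option can be sketched as a small conversion script. This is a minimal sketch under stated assumptions: it only buckets Kraken ledger rows by the `type` column; the actual column layout each manual-plugin CSV needs must be taken from the manual plugin documentation, and trade rows, which come in two ledger legs, need extra processing before they fit either bucket:)

```python
# Hypothetical conversion skeleton: bucket Kraken ledger rows (as dicts, e.g.
# from csv.DictReader over the export) into the three manual-plugin categories.
# Trade rows are left out here because their two legs must first be combined
# into a single transaction.
def split_kraken_ledger(rows):
    buckets = {"in": [], "out": [], "intra": []}
    for row in rows:
        if row["type"] == "deposit":
            buckets["in"].append(row)    # funds arriving at the exchange
        elif row["type"] == "withdrawal":
            buckets["out"].append(row)   # funds leaving the exchange
    return buckets
```

Each bucket would then be written out with `csv.writer`, with its columns rearranged into whatever layout the manual plugin documentation specifies for that file.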

@PanosChtz
Author

Ok, so as I understand these 2 options, the 1st option is what I was thinking of doing in my previous reply (i.e. break each of my CSVs down into 3 parts and rearrange the columns to match the structure of the test_manual_x.csv files), while the 2nd option is forking dali-rp2 and writing another plugin in addition to the 14 already existing ones. Is my understanding correct?
If yes, I think the 1st option is simpler.

@eprbell
Owner

eprbell commented Mar 27, 2025

Yes, that's correct, with the exception that option 1 may be slightly more involved than just rearranging columns. Depending on what the native format looks like, you may also need to sum, subtract or otherwise process native columns to represent the semantics of the manual CSV format. Also read the unique id FAQ.
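(As an example of that kind of extra processing: the two "trade" rows in the Kraken sample above are the two legs of one trade, BTC sold and LTC bought, and must be combined into a single buy/sell transaction. A hedged sketch of the pairing step; legs of one trade typically share a linking identifier such as `refid` in real exports, and grouping on the timestamp here is only an illustration using the sample rows:)

```python
from collections import defaultdict

# Hypothetical pairing step: a Kraken trade appears as two ledger legs, one
# with a negative amount (asset sold) and one with a positive amount (asset
# bought). Group the legs, then emit one sell/buy pair per trade.
def pair_trade_legs(rows):
    groups = defaultdict(list)
    for row in rows:
        if row["type"] == "trade":
            groups[row["time"]].append(row)  # real exports: group on refid
    pairs = []
    for legs in groups.values():
        sold = [r for r in legs if float(r["amount"]) < 0]
        bought = [r for r in legs if float(r["amount"]) > 0]
        if len(sold) == 1 and len(bought) == 1:
            pairs.append({"sell": sold[0], "buy": bought[0]})
    return pairs
```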
