Both documents end with sample calculations that look like this:
One policyID, no asset names
6 + FLOOR (((1 * 12) + 0 + (1 * 28) + 7) / 8) = 11
There is no explanation of what the 6 and 7 are, and it’s not quite clear what the three other components inside FLOOR are. The expectation seems to be that an app developer will deduce how to calculate the length of the native-token portion for any given UTxO.
Am I missing more clear explanation somewhere?
If not, can somebody show the calculation for this case:
a policy A with no token name + a policy B with a token name ‘43484f43’ + the same policy B with a token name ‘524245525259’
Maybe more cases will help me to deduce the calculation algorithm.
Things are much clearer in the morning. The constant 6 is still a mystery to me. The constant 7 makes sense, since we are rounding UP. The rest of the formula components are actually well described in that second document, Min-Ada-Value Requirement — Cardano Ledger 1.0.0 documentation.
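For anyone else trying to reproduce it, here is a minimal sketch of my reading of that formula (the names numPids, numAssets and sumAssetNameLengths follow the ledger doc; the `+ 7` and divide-by-8 just round bytes up to 8-byte words, the 6 looks like a fixed overhead in words, and asset-name lengths are counted in decoded bytes, so ‘43484f43’ is 4 and ‘524245525259’ is 6):

```cpp
#include <cstdint>
#include <iostream>

// Token-bundle size in 8-byte words, per the Min-Ada-Value Requirement doc:
//   6 + roundupBytesToWords(numAssets*12 + sumAssetNameLengths + numPids*28)
// where roundupBytesToWords(b) = floor((b + 7) / 8).
std::uint64_t bundleSizeWords(std::uint64_t numPids, std::uint64_t numAssets,
                              std::uint64_t sumAssetNameLengths) {
    std::uint64_t bytes = numAssets * 12 + sumAssetNameLengths + numPids * 28;
    return 6 + (bytes + 7) / 8;  // the +7 is what rounds UP to the next word
}

int main() {
    // Documented example: one policyID, one asset with no name
    //   6 + (1*12 + 0 + 1*28 + 7) / 8 = 6 + 5 = 11
    std::cout << bundleSizeWords(1, 1, 0) << "\n";   // 11

    // The case asked above: policy A with an empty name, policy B with
    // names 43484f43 (4 bytes) and 524245525259 (6 bytes):
    //   numPids = 2, numAssets = 3, sumAssetNameLengths = 0 + 4 + 6 = 10
    //   6 + (36 + 10 + 56 + 7) / 8 = 6 + 13 = 19
    std::cout << bundleSizeWords(2, 3, 10) << "\n";  // 19
}
```

With that reading, the requested case comes out to 19 words, and the documented one-policy/no-name example reproduces the 11 from the quote above.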
Thank you for posting this. Could you please point to the doc that says to use coinSize=2? The docs I read say that it will change to 2 in some future implementation. I guess the future has arrived.
In my implementation I’m using neither coinSize nor minUTxOValue. Once you know the size of the token bundle, the min amount is (utxoEntrySizeWithoutVal + bundle size) * utxoCostPerWord. The utxoCostPerWord is a protocol parameter.
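A minimal sketch of that computation, assuming utxoEntrySizeWithoutVal = 27 words (the constant given in the same ledger doc) and 34482 lovelace as an example utxoCostPerWord; both should be taken from the spec and the live protocol parameters rather than hard-coded:

```cpp
#include <cstdint>

// Minimum lovelace for a UTxO carrying a token bundle:
//   (utxoEntrySizeWithoutVal + bundleSizeWords) * utxoCostPerWord
// utxoEntrySizeWithoutVal is a constant from the ledger doc (27 words when I
// looked); utxoCostPerWord is the protocol parameter (34482 lovelace here is
// only an example value -- query the current parameters instead).
std::uint64_t minAdaForBundle(std::uint64_t bundleSizeWords,
                              std::uint64_t utxoCostPerWord = 34482,
                              std::uint64_t utxoEntrySizeWithoutVal = 27) {
    return (utxoEntrySizeWithoutVal + bundleSizeWords) * utxoCostPerWord;
}

// e.g. the 19-word bundle above: (27 + 19) * 34482 = 1586172 lovelace
```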
By the way, doing this in a shell script is not super convenient. Consider this UTxO, where you would need to iterate over the assets to estimate the formula variables:
Use whatever language you please, as this part is always calculated off-chain anyway. My minFeeAdaCalc function is in C++ currently. Note: it does not need to be exact. The operative keyword here is “minimum”, so whatever function you roll could include a fudge factor; then you just refund the unspent amount back to the input address as an additional output.
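For illustration only, a sketch of that fudge-factor idea (the 10% margin is an arbitrary placeholder, not what minFeeAdaCalc actually does):

```cpp
#include <cstdint>

// Deliberately overestimate the minimum; the excess comes back to the input
// address as an additional change output, so precision doesn't matter.
std::uint64_t minAdaWithFudge(std::uint64_t exactMinAda) {
    return exactMinAda + exactMinAda / 10;  // ~10% margin, purely arbitrary
}
```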