00:45:59 so if you can't mix the decoys, you won't be able to make a seraphis tx spending a seraphis enote until 128 seraphis enotes have appeared on the chain .... ?
00:46:41 replace 128 with whatever n the decoy set becomes
00:47:41 we have ~2 years (?)... is there anything we can do to the existing ringct method to make the decoys blendable?
00:48:05 a pre-seraphis ringct mod
00:49:49 no
00:49:55 blendable not possible
00:50:15 sounds like a chicken'n'egg problem then
00:50:29 much the same as anybody trying to clone Monero's code and launch a brand new chain
00:50:49 128 is 4 hours, and you can expect there will be mixed ringct and seraphis txs during the transition
00:50:56 there wouldn't be any outputs available to construct an 11- or 16-member ring on a freshly launched chain
00:51:11 you just wait for block rewards
00:52:29 or for a seraphis tx that spends only legacy enotes
00:52:34 a bunch of txs*
00:52:43 4 hours is *only* 128 outputs, so *one* txn could be created after 4 hours have passed
00:52:54 on a new chain yes
00:53:06 I forgot you can spend legacy enotes in a seraphis tx
00:53:43 it's just the turnaround of spending a seraphis enote where you need to wait at most 4 hrs before you can make a tx (or more, depending on the bin radius)
00:55:45 sounds like it would be much more than 4 hours. if you just go 4 hours and scoop up the 128 latest outputs, it seems to me the true output would be obvious
00:56:22 why
00:57:04 because of imbalance between coinbase and normal enotes?
00:57:06 hm, maybe not. was thinking there'd be 127 outputs with close sequential ages, and one arbitrary age
00:57:47 all seraphis ring members are selected by the DSA (whatever gets defined)
00:58:05 for seraphis inputs; legacy inputs would use the legacy DSA
00:59:01 ok. so in addition to coinbase there'd be new seraphis outs being produced from legacy inputs
00:59:55 would the legacy inputs be in a ring of 16 or a ring of 128?
01:01:09 spending legacy will be CLSAG
01:06:41 Rucknium[m]: comment on the legacy DSA post-transition -> it would be nice if at some point the DSA becomes timing agnostic (maybe asymptotically). This way you can't use the ring members in a legacy ring signature to accurately estimate when the signature was constructed. With deferred seraphis membership proofs, you can only defer the seraphis membership proofs (e.g. for multisig) - legacy proofs can't be deferred. If there is a statistically significant gap between the 'when was this proof constructed' timing projection for legacy versus seraphis membership proofs, that could allow tx fingerprinting.
01:09:31 It's possible to cache the block height when you construct legacy proofs, then later use that height when making seraphis membership proofs to inform the decoy selection. It would be nice to avoid that complexity.
01:12:41 UkoeHB: I don't think the DSA can be "timing agnostic" if it's a mimicking DSA, i.e. the type of DSA that we strive for currently. Partitioning or "single bin" DSA is a different story.
01:12:59 How often would people construct proofs at different times?
01:13:23 offline signing
01:16:12 Has anyone tried to understand this paper? https://moneroresearch.info/index.php?action=resource_RESOURCEVIEW_CORE&id=92
01:16:20 Liang, M., Karantaidou, I., Baldimtsi, F., Gordon, D. S., & Varia, M. (2022). (ε, δ)-indistinguishable mixing for cryptocurrencies.
01:17:00 seems that trying to mimic a "realistic" spending pattern is an exercise in futility
01:17:08 Maybe there is something in there.
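A minimal sketch of the binned reference-set idea behind the "bin radius" remarks above, assuming a simplified design: bins of consecutive seraphis enote indices, one bin placed so that it contains the real enote, and members drawn uniformly inside each bin. The parameter names and values are illustrative assumptions, not the actual Seraphis specification.

import random

# Illustrative parameters only; the real bin radius, bin count, and member
# count are whatever the Seraphis design finally specifies.
BIN_RADIUS = 64    # a bin spans 2*BIN_RADIUS + 1 consecutive enote indices
BIN_MEMBERS = 8    # reference-set members drawn from each bin
NUM_BINS = 16      # 16 bins * 8 members = 128 reference-set members

def pick_bin_center(real_index, chain_top):
    # Choose a bin center close enough to the real enote that the bin contains
    # it, then clamp so the whole bin stays within [0, chain_top].
    assert chain_top >= 2 * BIN_RADIUS, "not enough seraphis enotes on-chain yet"
    center = random.randint(max(real_index - BIN_RADIUS, 0),
                            min(real_index + BIN_RADIUS, chain_top))
    return min(max(center, BIN_RADIUS), chain_top - BIN_RADIUS)

def bin_members(center, real_index=None):
    # Draw members uniformly from the bin; force-include the real enote.
    lo, hi = center - BIN_RADIUS, center + BIN_RADIUS
    members = random.sample(range(lo, hi + 1), BIN_MEMBERS)
    if real_index is not None and real_index not in members:
        members[0] = real_index
    return sorted(members)

The assert is the chicken-and-egg point from the top of the log: a seraphis enote cannot be spent until the chain holds enough seraphis enotes to fill a bin around it, which is where the "wait ~4 hours, or more depending on the bin radius" estimate comes from.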
01:17:20 best would be random, no discernible pattern
01:17:22 Why is it futile?
01:17:29 because spending patterns change
01:17:53 and also, the current scheme is so highly dependent on the time a txn is created
01:20:33 what if you just cluster all outputs around the same age as the real output?
01:20:56 then it doesn't matter if someone signs a txn and waits a long time before submitting it
01:21:31 All Monero decoy selection algorithms (at least in the reference implementation) have been random as far as I know. I assume you mean to randomly select some distribution. Well, the method by which you select that random distribution would itself define a distribution, so I think we would be back to square 1.
01:22:11 hyc: That is "partitioning". It has been analyzed quite a bit in the Monero research literature. Certainly some people like it a lot.
01:22:42 sorry if I'm retreading old ground then. what are the major objections to it?
01:23:57 Of course it identifies the approximate timing of the sender's previous output. And a "strict" partitioning requires a spend waiting time of at most M, where M is the number of outputs that have already been confirmed.
01:24:14 So it would further ossify the 10 block lock and even make it unpredictable.
01:24:32 Also, targeted flooding or black marble attacks would be more feasible
01:24:56 I think it's OK to have partitioning in the conversation, but there are drawbacks of course
01:25:24 hmm. a flooding attack in this case requires you to know the age of the output you're attacking, no?
01:26:49 I, an adversary, send an output to a target user. At the same time, I flood the mempool with my own transactions and outputs. Then I can eliminate many decoys (or maybe all, if I am lucky and determined) when they go to spend that output.
01:27:08 Ronge, V., Egger, C., Lai, R. W. F., Schröder, D., & Yin, H. H. F. (2021). Foundations of ring sampling.
01:27:14 https://moneroresearch.info/index.php?action=resource_RESOURCEVIEW_CORE&id=19
01:27:18 Yeah, I see.
01:27:31 is the most recent analysis of it.
01:28:29 then those are pretty solid downsides
01:40:22 A lot of the literature on DSAs has favored partitioning since (IMHO) the authors cannot figure out a way to estimate the real spend distribution in order to construct a better mimicking DSA. The whole point of OSPEAD is to develop the first feasible estimator of Monero's real spend distribution that does not rely on a de-anonymized sample like Möser et al. (2018).
01:43:17 Ronge et al. (2021) say "It is therefore reasonable to expect that if the mimicking sampler has access to the true source distribution S, its anonymity should be close to optimal. In the following, we give an evidence that this is the case."
01:45:03 "We emphasize that although Theorem 6.2 shows that the optimal anonymity is always almost achievable up to a constant factor, the result is mostly of theoretical interest, because it requires the knowledge of an estimation Ŝ of the signer distribution S. Even if it is possible to obtain a reasonable estimation Ŝ of S, a questionable assumption, S may change over time, e.g., due to economic bubbles and recessions, and depends on the free will of users. For a good and practical sampler we recommend the partitioning sampler in Section 6.3."
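To make the two sampler families in this exchange concrete, here is a rough sketch of a partitioning sampler and a mimicking sampler. The ring size, the partition size, and especially the toy spend-age model are assumptions for illustration only, not the wallet's actual parameters.

import random

RING_SIZE = 16

def partitioning_ring(real_index, partition_size=RING_SIZE):
    # Partitioning sampler: the chain is cut into fixed consecutive partitions,
    # and the ring is simply the partition that contains the real output.
    # Everyone spending from that partition publishes the identical ring, but
    # the ring reveals the approximate age of the real output, and a "strict"
    # version forces you to wait until the partition has filled up.
    start = (real_index // partition_size) * partition_size
    return list(range(start, start + partition_size))

def mimicking_ring(real_index, chain_top, spend_age_sampler):
    # Mimicking sampler: decoys are drawn so that their ages follow an estimate
    # of the real spend-age distribution. Monero's current DSA is of this type.
    ring = {real_index}
    while len(ring) < RING_SIZE:
        candidate = chain_top - int(spend_age_sampler())
        if 0 <= candidate <= chain_top:
            ring.add(candidate)
    return sorted(ring)

# Toy spend-age model, in units of outputs (NOT the real wallet2 gamma model):
toy_age = lambda: random.expovariate(1 / 1000.0)
# e.g. mimicking_ring(real_index=500_000, chain_top=501_000, spend_age_sampler=toy_age)

The trade-off discussed above falls directly out of these two shapes: the partitioning ring leaks coarse timing and constrains when you can spend, while the mimicking ring's anonymity depends entirely on how good the spend-age estimate is.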
05:17:35 Is anyone looking closer at the zec implementation? They haven't rolled it into production yet?
15:35:03 Rucknium[m]: right, you need knowledge of the true spend distribution, which we can't actually know
15:48:41 A good estimate of it is possible, as I'll show
15:54:24 hmm. that kind of assumes you can estimate the population of real users, as well as the distribution of coins each user owns.
15:54:46 someone with an old wallet may have coins distributed evenly through time, or only a bunch of old coins
15:55:15 or whatever other distribution. spending patterns per user will be all over the place
15:55:58 Right. But there is an aggregate distribution.
16:02:44 If you want to be on the OSPEAD review panel, you are welcome to be. I think I will be done with the detailed proposal in 1.5 weeks. Then I will give the panel 1-1.5 months for review.
16:06:39 sure, sounds like it'll be quite interesting work
16:07:50 though it'll be mostly for my own edification. I haven't done anything in stats since college
16:11:05 Ok great. So far it's you, ArticMine, and Isthmus. I'm hoping to have a biostatistician on it too. I wrote a section on key statistical concepts to put everyone on mostly the same footing, hopefully.
17:53:24 I won't be available for the meeting Wednesday; someone else should pilot it
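A toy illustration, with entirely made-up numbers, of the exchange at 15:54-15:55 above: even if every user's spending pattern is different, the spends that actually appear on-chain are draws from one aggregate (mixture) distribution, and that aggregate is what a mimicking DSA, and an estimator like OSPEAD, targets.

import random

# Three made-up user "types" with very different spend-age behaviour
# (ages in blocks), and made-up weights for each type's share of spends.
user_models = [
    lambda: random.expovariate(1 / 20),       # fast spender: young outputs
    lambda: random.gammavariate(2.0, 500.0),  # typical user: ages around ~1000
    lambda: random.uniform(0, 100_000),       # old wallet: coins spread over years
]
weights = [0.5, 0.4, 0.1]

def observed_spend_age():
    # One on-chain spend: pick which kind of user is spending, then draw an age
    # from that user's own pattern. Individually the patterns are "all over the
    # place", but the pooled draws follow a single mixture distribution.
    model = random.choices(user_models, weights=weights)[0]
    return model()

ages = [observed_spend_age() for _ in range(100_000)]
# `ages` is a sample from the aggregate spend-age distribution; estimating that
# aggregate from chain data, rather than any single user's habits, is the goal.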