PaperHub
7.0 / 10
Poster · 3 reviewers
Ratings: min 4, max 5, std 0.5
5
4
4
Confidence: 3.3
Novelty: 2.3
Quality: 2.3
Clarity: 2.7
Significance: 2.0
NeurIPS 2025

ShortListing Model: A Streamlined Simplex Diffusion for Discrete Variable Generation

OpenReview · PDF
Submitted: 2025-05-10 · Updated: 2025-10-29
TL;DR

We introduce the Shortlisting Model, a novel simplex-based diffusion model for discrete variable generation.

Abstract

Keywords
Discrete Generative Model; Biological Sequence Design;

Reviews & Discussion

Review
5
  • The paper introduces the ShortListing Model (SLM), a novel simplex-based diffusion model for discrete variable generation.
  • Models discrete variable generation as progressive candidate pruning, starting from the full vocabulary and iteratively narrowing down to a single choice.
  • Implements classifier-free guidance
  • Demonstrates competitive performance across text, protein design and DNA promoter and enhancer design

Strengths & Weaknesses

Strengths and weaknesses

Strengths

  • The paper is well written and easy to follow
  • The analysis of mitigating gradient vanishing is interesting; the solution results in a familiar objective that the authors connect to prior literature, which makes the section easy to read.
  • The experiments are substantial and challenging, and they demonstrate the model's performance.

Weaknesses

Theoretical concerns

The theory seems solid to me and I don't have many concerns.

Minor concerns

  • The link to the code points to an empty file / error.
  • L46: A reference on BFN is missing.

Questions

Questions and suggestions

Question on the approach

  • Could you clarify the candidate set construction? My understanding is that given a data point x, you build a candidate set x_0^c (a tensor the size of the vocabulary) with ones for present tokens and zeros otherwise. The corrupting process adds tokens until all are present (x_T^c = 1), and generation reverses this. Do you maintain one candidate set per dimension? If so, how do you address potential scaling issues for high-dimensional data?

Questions on modelling capabilities and similarities to CTMC

  • What types of transitions can your model capture? Standard discrete diffusion restricts transitions to Hamming-1 distance. How does your approach differ?

  • How does this compare to a CTMC with an absorbing mask? Can your procedure be formulated within that framework?

  • How does it relate to uniform kernel approaches? Conceptually, your approach is similar to uniform discrete diffusion models. The category set at the beginning can be seen as a Dirac distribution, and the category set at the end of the process as an (unnormalised) uniform distribution. A comparison to standard discrete models would be valuable, especially given that D3PM Absorb is competitive in Table 2 while D3PM Uniform lags behind.

Questions on generation dynamics

  • When generating text, have you been able to observe which elements of the sentence appear first? Or said differently, when the candidate set reduces to its last element, what does the process look like? I am curious whether the model commits early in the process to the word it will choose.

  • An additional question: which words are chosen first and which are chosen last in the sentence? Or said differently, depending on position in the sentence, do some candidate sets converge faster than others?

  • It would be interesting to see a plot of candidate-set cardinality against time, visualized as a heatmap, to see which tokens converge faster than others.

Limitations

Yes

Justification for Final Rating

I believe this is interesting work on discrete diffusion, with thorough experimentation, and it was strengthened by the rebuttal. Thus I recommend accepting this work.

Formatting Concerns

No formatting concerns

Author Response

Q1: About the candidate set construction process

Thank you for your question. Your understanding of the process for a single discrete variable is correct; for a variable with K categories, we start with its one-hot representation x_0^c and progressively add candidates until we reach the all-one vector x_T^c.

For high-dimensional data, such as a sequence of length L, this concept is applied position-wise, meaning we maintain one K-dimensional candidate set for each of the L positions in the sequence.
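To make the position-wise construction above concrete, here is a minimal sketch based on our reading of this answer. It is not the authors' code; in particular, adding each non-ground-truth category independently with probability t is only a stand-in for the paper's actual transition schedule (Eq. 3).

```python
import numpy as np

def corrupt_candidate_sets(x, K, t, rng=None):
    """Sketch of the forward (corruption) process described above.

    x : (L,) array of ground-truth token ids for a length-L sequence.
    K : vocabulary size.
    t : corruption level in [0, 1]; t=0 keeps the one-hot sets x_0^c,
        t=1 yields the all-one vector x_T^c at every position.
    Returns an (L, K) binary tensor: one candidate set per position.
    NOTE: the Bernoulli(t) addition rule here is an assumption, not
    the paper's actual schedule.
    """
    rng = np.random.default_rng() if rng is None else rng
    L = len(x)
    cand = np.zeros((L, K), dtype=np.int8)
    cand[np.arange(L), x] = 1          # one-hot representation x_0^c
    extra = (rng.random((L, K)) < t).astype(np.int8)
    cand = np.maximum(cand, extra)     # candidates are only ever added
    return cand
```

Note how multiple candidates can be added to a position in a single step, which matches the parallel-addition property discussed in Q2 below.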

We want to highlight that our work places special emphasis on addressing a critical scaling challenge inherent to simplex-based methods, which is detailed in Appendix C.1.2. We identified that performance can degrade when the vocabulary size K exceeds the model's embedding dimension. We successfully mitigated this issue by making the network architecture wider, which provides the capacity needed to represent the high-dimensional simplex. This solution is validated by our strong results on the large-vocabulary OpenWebText dataset, as shown in Table 5.


Q2: Modeling capabilities

Thank you for this insightful question. We agree that our model's transitions are fundamentally built upon a Hamming-1 principle, where the atomic operation is the addition of a single category to the candidate set. The key distinction of our approach is that, as defined by our transition probability in Eq.3, our framework allows multiple new candidates to be added in parallel within a single step. This provides a more flexible and efficient transition path.


Q3: Relationship between SLM and CTMC

We appreciate the reviewer raising this comparison. While there are conceptual parallels in the gradual removal of information, our SLM framework cannot be directly formulated as a standard CTMC with a single absorbing mask.

Instead of augmenting the vocabulary with a single [MASK] token, SLM operates on a much richer state space of 2^K - 1 possible candidate sets. The transition path is not to a single absorbing state but follows a highly structured, monotonic inclusion of categories through a hierarchy of these sets, as defined in Eq. 2. The generative process is one of guided "candidate pruning" from an all-inclusive state down to a single choice, a mechanism distinct from generating from a single absorbing mask state.
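The "candidate pruning" view can be illustrated with a small sketch. This is not the paper's exact reverse kernel (Eq. 2); it only demonstrates the monotone property: candidates are removed in proportion to model probabilities and never re-added, until a single choice survives.

```python
import numpy as np

def prune_step(cand, probs, keep_frac, rng=None):
    """One hypothetical 'shortlisting' step for a single position
    (an illustration of monotone candidate pruning, not the paper's
    actual reverse transition).

    cand      : (K,) binary candidate-set indicator.
    probs     : (K,) model probabilities over the vocabulary.
    keep_frac : fraction of the current candidates to keep.
    """
    rng = np.random.default_rng() if rng is None else rng
    idx = np.flatnonzero(cand)                       # current candidates
    n_keep = max(1, int(np.ceil(keep_frac * len(idx))))
    p = probs[idx] + 1e-12                           # guard zero probs
    keep = rng.choice(idx, size=n_keep, replace=False, p=p / p.sum())
    new = np.zeros_like(cand)
    new[keep] = 1                                    # candidates only shrink
    return new
```

Iterating this step from the all-one vector with keep_frac < 1 terminates at a single surviving category, mirroring the pruning process described above.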


Q4: Relationship between SLM and uniform discrete diffusion approaches

We appreciate the opportunity to clarify this point. In our view, SLM differs from uniform discrete diffusion methods. Firstly, the noisy state in SLM consists of tensors filled with '1's, resulting in identical outcomes for all tokens. In contrast, uniform diffusion achieves a state of uniform noise across the entire sentence, meaning there is still variability among the tokens. From this perspective, the noisy state in SLM is more akin to masked diffusion.

Another characteristic of uniform diffusion is that it retains the possibility for each token to transform into any other token, even at the final sampling step. In contrast, in SLM, these possibilities gradually decrease, meaning that by the last sampling steps, only two or three choices remain for each token. This gradual reduction in possible transformations in SLM not only makes the model more focused and efficient but also enables more controlled and coherent sequence generation.

Furthermore, the key difference between our SLM and uniform-transition diffusion lies in the structured, hierarchical construction of candidate sets. Consider the following special case of our model: it starts from a delta distribution (a single state), expands to a set of semantically similar items, and finally encompasses the entire candidate space. This ability to encode a hierarchical property in the input space is the fundamental distinction and a capability that standard uniform-transition D3PMs lack.

We strongly agree with the reviewer that it is valuable to discuss the differences between our approach and the various existing methods. We assure you that the discussion above will be added to the related work section in the revision of our paper.


About Minor concerns

Thank you for bringing this to our attention. We will make sure to include the missing reference for BFN in line 46 in our revised version. Unfortunately, due to recent policy changes, we are unable to provide an anonymous link in the rebuttal section. However, we assure you that our code will be released shortly to enable others to fully reproduce our work.


Q5: Visualization of the generation process

In response to the reviewer’s insightful request for a more detailed look at our model’s sampling dynamics, we had originally intended to include a detailed heatmap in this rebuttal. However, the current policy restrictions prohibit us from sharing any anonymized or non-anonymized links during this phase. To ensure full transparency, we have therefore provided the complete analysis of the heatmap below and will try to make the figure itself available via public comment (in Markdown) and include it in the revised manuscript PDF. We appreciate your understanding.

One example row of our heatmap:


t=0.0000: ......

......

t=0.7340: \*\*\*\*r\*\*\*\*\*\*num\*\*r\*\*\*\*th\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*a\*o\*\*\*\*\*\*\*\*\*\*\*\*u\*\*\*\*\*\*nt\*\*y\*\*\*s\*\*\*t\*a\*\*\*u\*\*ic\*\*\*\*\*n

(The asterisks' colors, which encode candidate-set sizes, are not preserved in this text export.)

......

t=1.0000: .....


Setup and Figure Explanation:

We denote any token that has not yet been determined by "*". Unlike masked-diffusion methods (e.g., MDLM), we color-code the asterisks to reflect the current size of the candidate set: light shades indicate larger candidate sets, dark shades indicate few remaining choices. Under this scheme, sampling with SLM resembles a waterfall: as time progresses, each position's color deepens, signifying convergence toward its final token choice.

Key Observations:

  1. Early Pruning, Late Decisions

In the example, most positions begin eliminating unlikely candidates early on, even if they do not finalize their choice until later, which ensures that plausible tokens remain under consideration.

  2. Convergence Orders

In the example, short or common tokens (like "th", "ent", "num") tend to converge first, as they are easy to predict, while mid-sequence and rare tokens often remain ambiguous until the final iterations, as they may be the semantic core of the sentence. These patterns align with the intuition that simple parts of speech and frequent words are decided early, while context-sensitive or infrequent tokens require more sampling time to resolve.


From our perspective, these observations show the advantages SLM has over masked diffusion and uniform diffusion. On the one hand, SLM has exact control of coverage during sampling, which makes it convenient for researchers to judge at which steps things could go wrong (when no reasonable choice remains in a candidate set). On the other hand, SLM keeps open the chance for the model to reconsider and correct its mistakes in the last few steps, which benefits harder tasks.
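For readers who want to mock up a figure of this kind, the data behind such a cardinality heatmap can be sketched as below. This is synthetic pruning dynamics under our own assumptions, not the authors' plotting code; rendering is left to, e.g., matplotlib's imshow.

```python
import numpy as np

def cardinality_heatmap(L=16, K=32, steps=10, seed=0):
    """Synthetic data for a candidate-set-size heatmap: rows are time
    steps, columns are sequence positions, and values are candidate-set
    cardinalities that shrink monotonically from K toward 1 during
    sampling. The random per-position pruning rate is an assumption."""
    rng = np.random.default_rng(seed)
    sizes = np.full(L, K)
    rows = [sizes.copy()]
    for _ in range(steps):
        # Each position prunes a random fraction of its candidates,
        # so some positions converge faster than others.
        frac = rng.uniform(0.3, 0.9, size=L)
        sizes = np.maximum(1, (sizes * frac).astype(int))
        rows.append(sizes.copy())
    return np.stack(rows)  # render with e.g. plt.imshow(M, aspect='auto')
```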

Comment

I thank the authors for addressing my concerns. I believe this work offers an interesting take on discrete diffusion, with thorough experimentation. After going through the rebuttal and giving it more consideration, I decided to raise my recommendation to an accept.

Comment

t=0.6040: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*nt\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*c\*\*\*\*\*n

t=0.6300: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*nt\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*c\*\*\*\*\*n

t=0.6560: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*nt\*\*\*\*\*\*s\*\*\*\*\*\*\*\*\*\*\*\*\*c\*\*\*\*\*n

t=0.6820: \*\*\*\*r\*\*\*\*\*\*n\*m\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*nt\*\*\*\*\*\*s\*\*\*\*\*\*\*\*\*\*\*\*ic\*\*\*\*\*n

Comment

t=0.3880: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*

t=0.4420: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*

t=0.4980: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*c\*\*\*\*\*\*

Comment

t=0.9440: un\*er\*\*the\*number\*a\*\*the\*\*\*\*i\*ni\*g\*\*f\*th\*\*xi\*\*\*na\*opponen\*\*le\*\*ubsequent\*by\*esstent\*alcou\*ficient\*n

t=0.9700: under\*\#the\*number\*at\*the\*\*e\*inni\*g\*\*f\*th\*\*ximb\*na\*opponent\*le\*\*ubsequent\*by\*esstentialcou\*ficient\*n

t=0.9980: under\*\#the\*number\*at\*the\*\*e\*inni\*g\*\*f\*th\*\*ximb\*na\*opponent\*le\*\*ubsequent\*by\*esstentialcoufficient\*n

t=1.0000: under\*\#the\*number\*at\*the\*beginning\*of\*the\*ximbana\*opponentale\*subsequent\*by\*esstentialcoufficient\*n

Comment

t=0.8660: un\*er\*\*t\*e\*num\*er\*\*\*\*the\*\*\*\*i\*\*i\*\*\*\*f\*\*h\*\*xi\*\*\*na\*o\*p\*n\*n\*\*\*\*\*\*u\*s\*q\*ent\*by\*e\*ste\*t\*alcou\*\*ic\*\*\*t\*n

t=0.8920: un\*er\*\*t\*e\*number\*a\*\*the\*\*\*\*i\*\*i\*g\*\*f\*\*h\*\*xi\*\*\*na\*opp\*n\*n\*\*\*e\*\*u\*s\*q\*ent\*by\*esste\*t\*alcou\*fic\*\*\*t\*n

t=0.9180: un\*er\*\*t\*e\*number\*a\*\*the\*\*\*\*i\*ni\*g\*\*f\*th\*\*xi\*\*\*na\*opp\*nen\*\*le\*\*u\*sequent\*by\*esstent\*alcou\*fici\*\*t\*n

Comment

t=0.7860: u\*\*\*r\*\*\*\*\*\*num\*er\*\*\*\*th\*\*\*\*\*i\*\*i\*\*\*\*f\*\*\*\*\*xi\*\*\*\*a\*o\*\*\*n\*n\*\*\*\*\*\*u\*s\*q\*ent\*by\*\*\*s\*\*\*t\*a\*\*\*u\*\*ic\*\*\*\*\*n

t=0.8140: u\*\*\*r\*\*\*\*\*\*num\*er\*\*\*\*th\*\*\*\*\*i\*\*i\*\*\*\*f\*\*\*\*\*xi\*\*\*\*a\*o\*p\*n\*n\*\*\*\*\*\*u\*s\*q\*ent\*by\*e\*s\*e\*t\*a\*\*ou\*\*ic\*\*\*\*\*n

t=0.8400: un\*er\*\*\*\*e\*num\*er\*\*\*\*th\*\*\*\*\*i\*\*i\*\*\*\*f\*\*h\*\*xi\*\*\*na\*o\*p\*n\*n\*\*\*\*\*\*u\*s\*q\*ent\*by\*e\*s\*e\*t\*a\*\*ou\*\*ic\*\*\*\*\*n

评论

t=0.7080: \*\*\*\*r\*\*\*\*\*\*n\*m\*\*\*\*\*\*\*t\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*a\*\*\*\*\*\*\*\*\*\*\*\*\*\*u\*\*\*\*\*\*nt\*\*y\*\*\*s\*\*\*\*\*a\*\*\*u\*\*ic\*\*\*\*\*n\color{Magenta}{`\*\*\*\*`}\color{Blue}{`r`}\color{Magenta}{`\*\*\*\*\*\*`}\color{Blue}{`n`}\color{Magenta}{`\*`}\color{Blue}{`m`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{White}{`\*`}\color{Blue}{`t`}\color{Magenta}{`\*\*\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*\*\*\*\*\*\*`}\color{White}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*\*\*\*\*\*\*\*\*\*`}\color{Cyan}{`\*`}\color{Blue}{`a`}\color{Magenta}{`\*\*\*\*\*\*\*\*\*\*\*\*\*\*`}\color{Blue}{`u`}\color{Magenta}{`\*\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Blue}{`nt`}\color{White}{`\*`}\color{Magenta}{`\*`}\color{Blue}{`y`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{Blue}{`s`}\color{Magenta}{`\*\*\*\*\*`}\color{Blue}{`a`}\color{Magenta}{`\*\*\*`}\color{Blue}{`u`}\color{Magenta}{`\*\*`}\color{Blue}{`ic`}\color{Magenta}{`\*\*\*\*\*`}\color{Blue}{`n`}

t=0.7340: \*\*\*\*r\*\*\*\*\*\*num\*\*r\*\*\*\*th\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*a\*o\*\*\*\*\*\*\*\*\*\*\*\*u\*\*\*\*\*\*nt\*\*y\*\*\*s\*\*\*t\*a\*\*\*u\*\*ic\*\*\*\*\*n\color{Magenta}{`\*\*\*\*`}\color{Blue}{`r`}\color{Magenta}{`\*\*\*\*\*\*`}\color{Blue}{`num`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Blue}{`r`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{White}{`\*`}\color{Blue}{`th`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*\*\*\*\*\*\*`}\color{White}{`\*`}\color{Magenta}{`\*\*`}\color{White}{`\*`}\color{Magenta}{`\*\*\*\*\*\*\*\*\*\*`}\color{Blue}{`a`}\color{Magenta}{`\*`}\color{Blue}{`o`}\color{Magenta}{`\*\*\*\*\*\*\*\*\*\*`}\color{White}{`\*`}\color{Magenta}{`\*`}\color{Blue}{`u`}\color{Magenta}{`\*\*\*\*\*\*`}\color{Blue}{`nt`}\color{White}{`\*`}\color{Magenta}{`\*`}\color{Blue}{`y`}\color{Magenta}{`\*\*\*`}\color{Blue}{`s`}\color{Magenta}{`\*\*\*`}\color{Blue}{`t`}\color{Magenta}{`\*`}\color{Blue}{`a`}\color{Magenta}{`\*\*\*`}\color{Blue}{`u`}\color{Magenta}{`\*\*`}\color{Blue}{`ic`}\color{Magenta}{`\*\*\*\*\*`}\color{Blue}{`n`}

t=0.7600: u\*\*\*r\*\*\*\*\*\*num\*er\*\*\*\*th\*\*\*\*\*\*\*\*i\*\*\*\*f\*\*\*\*\*x\*\*\*\*\*a\*o\*\*\*\*\*\*\*\*\*\*\*\*u\*s\*q\*\*nt\*\*y\*\*\*s\*\*\*t\*a\*\*\*u\*\*ic\*\*\*\*\*n\color{Blue}{`u`}\color{Magenta}{`\*\*\*`}\color{Blue}{`r`}\color{Magenta}{`\*\*\*\*\*\*`}\color{Blue}{`num`}\color{Magenta}{`\*`}\color{Blue}{`er`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{White}{`\*`}\color{Blue}{`th`}\color{Magenta}{`\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*\*\*`}\color{Blue}{`i`}\color{Magenta}{`\*\*`}\color{White}{`\*`}\color{Magenta}{`\*`}\color{Blue}{`f`}\color{White}{`\*`}\color{Magenta}{`\*\*\*\*`}\color{Blue}{`x`}\color{Magenta}{`\*\*\*\*\*`}\color{Blue}{`a`}\color{Magenta}{`\*`}\color{Blue}{`o`}\color{Magenta}{`\*\*\*\*\*\*\*\*\*\*`}\color{White}{`\*`}\color{Magenta}{`\*`}\color{Blue}{`u`}\color{Magenta}{`\*`}\color{Blue}{`s`}\color{Magenta}{`\*`}\color{Blue}{`q`}\color{Magenta}{`\*\*`}\color{Blue}{`nt`}\color{White}{`\*`}\color{Magenta}{`\*`}\color{Blue}{`y`}\color{Magenta}{`\*\*\*`}\color{Blue}{`s`}\color{Magenta}{`\*\*\*`}\color{Blue}{`t`}\color{Magenta}{`\*`}\color{Blue}{`a`}\color{Magenta}{`\*\*\*`}\color{Blue}{`u`}\color{Magenta}{`\*\*`}\color{Blue}{`ic`}\color{Magenta}{`\*\*\*\*\*`}\color{Blue}{`n`}

评论

t=0.5240: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*n\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*c\*\*\*\*\*\*\color{Magenta}{`\*\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*\*\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*\*\*\*\*\*`}\color{Magenta}{`\*\*\*`}\color{Cyan}{`\*\*\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*\*\*\*\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*\*`}\color{Cyan}{`\*\*\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*\*\*\*\*`}\color{Blue}{`n`}\color{Magenta}{`\*\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*`}\color{Blue}{`c`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*\*`}\color{Magenta}{`\*`}

t=0.5500: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*n\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*c\*\*\*\*\*\*\color{Magenta}{`\*\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*\*\*`}\color{Magenta}{`\*\*\*`}\color{Cyan}{`\*\*\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*\*\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*\*`}\color{Magenta}{`\*\*\*\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*\*\*\*\*`}\color{Blue}{`n`}\color{Magenta}{`\*\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*`}\color{Blue}{`c`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}

t=0.5760: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*n\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*c\*\*\*\*\*\*\color{Magenta}{`\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*\*\*`}\color{Magenta}{`\*\*\*`}\color{Cyan}{`\*\*\*\*\*`}\color{Magenta}{`\*\*\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*\*\*\*\*\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*\*\*`}\color{Magenta}{`\*\*\*\*\*\*\*\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*\*\*\*\*`}\color{Blue}{`n`}\color{Magenta}{`\*\*\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*\*`}\color{Cyan}{`\*\*`}\color{Magenta}{`\*`}\color{Blue}{`c`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}\color{Cyan}{`\*`}\color{Magenta}{`\*\*`}

评论

t=0.2200: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\color{Cyan}{`\*`}\color{DarkGrey}{`\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*\*`}\color{Grey}{`\*\*`}\color{DarkGrey}{`\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*\*`}\color{Grey}{`\*\*\*\*`}\color{DarkGrey}{`\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*\*`}\color{DarkGrey}{`\*\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*`}\color{Grey}{`\*\*\*`}\color{DarkGrey}{`\*\*\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*`}\color{LightGrey}{`\*\*`}\color{DarkGrey}{`\*`}\color{LightGrey}{`\*\*`}\color{DarkGrey}{`\*\*`}\color{Grey}{`\*\*`}\color{DarkGrey}{`\*`}\color{Grey}{`\*\*`}\color{DarkGrey}{`\*`}\color{Grey}{`\*\*\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*\*`}\color{DarkGrey}{`\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*\*\*\*`}\color{Grey}{`\*\*`}\color{DarkGrey}{`\*\*`}\color{Grey}{`\*\*\*\*`}\color{DarkGrey}{`\*`}\color{Grey}{`\*\*\*\*`}\color{DarkGrey}{`\*\*\*\*`}\color{Grey}{`\*\*\*\*`}\color{DarkGrey}{`\*\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*\*\*`}\color{Grey}{`\*\*\*\*`}\color{DarkGrey}{`\*\*`}\color{Grey}{`\*\*`}\color{DarkGrey}{`\*\*\*\*`}

t=0.2760: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\color{Cyan}{`\*`}\color{DarkGrey}{`\*`}\color{Grey}{`\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*\*\*\*\*\*`}\color{Grey}{`\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*\*\*\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*\*`}\color{DarkGrey}{`\*\*`}\color{Grey}{`\*\*`}\color{DarkGrey}{`\*`}\color{Grey}{`\*\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*\*\*\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*\*\*`}\color{Grey}{`\*\*`}\color{DarkGrey}{`\*\*`}\color{Cyan}{`\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*\*\*\*\*`}\color{Grey}{`\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*\*\*\*\*\*\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*\*\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*`}\color{Grey}{`\*\*`}\color{DarkGrey}{`\*\*\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*\*\*\*\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*`}\color{Grey}{`\*\*`}\color{DarkGrey}{`\*\*\*\*`}

t=0.3320: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\color{Cyan}{`\*\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*\*`}\color{DarkGrey}{`\*\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*\*\*\*\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*\*\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*\*`}\color{Cyan}{`\*\*\*`}\color{DarkGrey}{`\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*\*\*\*\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*\*`}\color{DarkGrey}{`\*\*\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*\*`}\color{Cyan}{`\*\*\*\*\*`}\color{DarkGrey}{`\*\*\*`}\color{Cyan}{`\*\*`}\color{DarkGrey}{`\*\*`}\color{Cyan}{`\*\*\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*\*`}\color{Cyan}{`\*`}\color{DarkGrey}{`\*`}\color{Cyan}{`\*\*`}\color{DarkGrey}{`\*\*\*\*\*`}\color{Cyan}{`\*`}

评论

t=0.0000: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\color{#E4E2E2}{`\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*`}

t=0.0540: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\color{LightGrey}{`\*\*\*\*\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*\*\*\*\*\*\*\*`}\color{#E4E2E2}{`\*\*`}\color{LightGrey}{`\*\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*`}\color{#E4E2E2}{`\*\*`}\color{LightGrey}{`\*\*\*\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*\*\*\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*\*\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*\*\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*\*\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*\*\*\*\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*\*`}\color{#E4E2E2}{`\*\*`}\color{LightGrey}{`\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*\*`}\color{#E4E2E2}{`\*\*`}\color{LightGrey}{`\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*\*\*\*\*`}

t=0.1100: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\color{LightGrey}{`\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*\*\*\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*`}\color{#E4E2E2}{`\*`}\color{LightGrey}{`\*\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*\*\*\*\*\*\*\*\*\*\*\*`}

t=0.1660: \*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\*\color{DarkGrey}{`\*`}\color{Grey}{`\*\*`}\color{DarkGrey}{`\*\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*\*\*\*\*`}\color{LightGrey}{`\*\*\*\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*\*`}\color{Grey}{`\*\*\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*`}\color{DarkGrey}{`\*`}\color{LightGrey}{`\*\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*\*`}\color{Grey}{`\*`}\color{DarkGrey}{`\*`}\color{LightGrey}{`\*\*`}\color{Grey}{`\*\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*\*`}\color{LightGrey}{`\*\*\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*\*\*\*\*`}\color{LightGrey}{`\*`}\color{DarkGrey}{`\*`}\color{Grey}{`\*\*\*\*\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*\*\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*\*\*\*\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*\*\*\*\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*\*`}\color{DarkGrey}{`\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*`}\color{LightGrey}{`\*\*`}\color{Grey}{`\*\*\*`}\color{LightGrey}{`\*`}\color{Grey}{`\*\*\*\*`}

Review
4

This paper proposes a novel simplex-based discrete diffusion process which operates on simplex centroids. The authors claim that their method demonstrates competitive performance on DNA promoter and enhancer design, protein design, and character-level and large-vocabulary language modeling.

Strengths and Weaknesses

Strengths

The paper proposes an interesting approach for performing discrete diffusion.

Weaknesses

Likelihood Evaluation: The simplex-based approach closely resembles uniform-state diffusion. Therefore, the authors should compare their method with UDLM and DUO. Additionally, on more complex datasets such as OWT, the proposed model clearly underperforms compared to UDLM and DUO, as evident from the PPL numbers reported in Sahoo et al., 2025.

Classifier-Free Guidance:

Several important baselines are missing for classifier-free guidance. The authors should compare their method with existing CFG approaches for discrete diffusion, such as Schiff et al., 2025 and Nisanoff et al., 2025.


References

UDLM: Schiff et al., 2025, “Simple Guidance Mechanisms for Discrete Diffusion Models,” ICLR 2025.

DUO: Sahoo et al., 2025, “The Diffusion Duality,” ICML 2025.

Questions

See above

Limitations

yes

Final Justification

I chose to increase the score because, during the rebuttal, the authors demonstrated that this method is better than UDLM and DUO, which operate purely in the discrete space.

Formatting Issues

none

Author Response

W1:More baselines of UDLM and DUO

Thanks for the useful suggestion to add more baselines. We have included enhanced uniform-state diffusion baselines (UDLM and DUO) on several tasks; the results can be summarized as follows:

On Language Modeling:

Updated Table 1: Bits Per Character (BPC) on the Text8 test set

| Category | Model | BPC |
|---|---|---|
| Autoregressive | Transformer AR | 1.23 |
| Autoregressive | AR Argmax Flow | 1.39 |
| Autoregressive | AR Discrete Flow | 1.23 |
| Any-order Autoregressive | ARDM | ≤1.43 |
| Any-order Autoregressive | MAC | ≤1.40 |
| Continuous Diffusion | Plaid | ≤1.48 |
| Discrete Diffusion | Mult. Diffusion | ≤1.72 |
| Discrete Diffusion | D3PM Uniform | ≤1.61 |
| Discrete Diffusion | D3PM Absorb | ≤1.45 |
| Discrete Diffusion | SEDD Absorb | ≤1.41 |
| Discrete Diffusion | MDLM | ≤1.39 |
| Discrete Diffusion | MD4 | ≤1.37 |
| Discrete Diffusion | UDLM | ≤1.60 |
| Discrete Diffusion | DUO | ≤1.42 |
| Simplex Approaches | BFN | ≤1.41 |
| Simplex Approaches | DFM | ≤1.41 |
| Simplex Approaches | SFM | 1.39 |
| Simplex Approaches | SLM (L-simple) | ≤1.42 |
| Simplex Approaches | SLM (L-weight) | ≤1.38 |

On Text8, uniform-state diffusion variants UDLM and DUO improve over D3PM-Uniform (≤1.61), achieving ≤1.60 and ≤1.42, respectively, yet they still lag behind MDLM (≤1.39) and SLM (L-weight ≤1.38).
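For readers unfamiliar with the metric, BPC is the model's average negative log-likelihood per character expressed in bits. A minimal conversion sketch (the NLL values below are illustrative, not taken from the paper):

```python
import math

def bits_per_character(total_nll_nats: float, num_chars: int) -> float:
    """Convert a summed negative log-likelihood (in nats) to bits per character."""
    return total_nll_nats / (num_chars * math.log(2))

# An average NLL of ~0.96 nats per character corresponds to ~1.39 BPC.
print(bits_per_character(0.96 * 1000, 1000))
```

Dividing by ln 2 converts nats to bits; reported BPC numbers like those above come from averaging this quantity over the test set.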

On DNA Sequence Modeling:

Updated Table 2: DNA promoter design

| Model | MSE ↓ |
|---|---|
| Bit Diffusion (bit enc) | 0.041 |
| BFN | 0.0405 ± 0.0003 |
| Bit Diffusion (one-hot) | 0.040 |
| D3PM-uniform | 0.038 |
| DDSM | 0.033 |
| Autoregressive | 0.034 ± 0.001 |
| Dirichlet FM | 0.034 ± 0.004 |
| UDLM | 0.030 ± 0.001 |
| DUO | 0.042 ± 0.002 |
| Fisher-Flow | 0.029 ± 0.001 |
| SLM | 0.0265 ± 0.0006 |

For promoter MSE, UDLM (0.030 ± 0.001) shows clear gains over D3PM-Uniform (0.038) and several earlier approaches, but it remains above SLM (0.0265 ± 0.0006). DUO (0.042 ± 0.002) performs worse in this setting for one possible reason: it was not paired here with discrete consistency distillation in a comparison of all schemes training from scratch.

On Protein Sequence Modeling:

Updated experimental results on protein sequence design:

| Model | Designability ↑ | Diversity ↓ | Fitness ↓ | Self-Consistency ↓ |
|---|---|---|---|---|
| ESM1-43M | 33.92 | 0.1646 | 2.746 | 11.20 |
| ESM2-150M | 32.44 | 0.1594 | 2.501 | 10.03 |
| EvoDiff-OADM-38M | 34.62 | 0.1782 | 2.857 | 11.02 |
| EvoDiff-D3PM-38M | 31.99 | 0.1808 | 2.883 | 11.51 |
| MDLM-38M | 39.47 | 0.1634 | 2.230 | 8.562 |
| DFM-Uniform-38M | 32.20 | 0.1723 | 2.783 | 11.44 |
| DFM-Mask-38M | 36.16 | 0.1650 | 2.181 | 8.809 |
| AR-38M | 38.96 | 0.1689 | 2.775 | 10.41 |
| UDLM-38M | 38.79 | 0.1833 | 2.526 | 10.37 |
| DUO-38M | 28.22 | 0.1395 | 2.974 | 15.35 |
| SLM-38M | 41.08 | 0.1598 | 2.127 | 8.286 |

All models were trained for 100,000 steps on UniRef50. UDLM-38M performs roughly on par with AR but underperforms MDLM. DUO-38M achieves strong diversity but falls short on the other metrics. These results suggest that SLM strikes a more robust balance for protein design.

These additional experiments confirm that uniform-state diffusion methods yield consistent improvements over D3PM across tasks, yet also show that they still underperform masked diffusion and SLM. SLM's more resilient performance may offer directions for further improving uniform discrete diffusion models such as UDLM and DUO.

W2:Results on OpenWebText

We thank the reviewers for their insightful comments regarding performance on the OpenWebText language modeling task. We agree that simplex-based approaches have historically faced scalability challenges with large vocabularies. Our work addresses this directly with two key contributions:

  1. Our proposed framework, SLM, yields a significant performance boost compared to previous simplex-based methods.
  2. Crucially, we are the first to identify the root cause of these scaling issues, with a detailed analysis provided in Appendix C.1.2.

While we recognize that further effort is required to fully optimize performance in this area, we believe this diagnosis and the resulting improvements are a significant contribution to this line of research.

Furthermore, we respectfully suggest that the evaluation of discrete diffusion models should not be limited to language tasks alone. In domains such as biological sequence design, the absence of a strict left-to-right order gives non-autoregressive models a distinct advantage, making them a more natural fit than autoregressive models. As our experiments highlight, SLM exhibits notably superior performance on protein and DNA design tasks, where it has the potential to become a leading paradigm.

In summary, our evaluation spans a diverse range of tasks—encompassing both large and small vocabularies, as well as language and non-language domains. We believe this comprehensive performance demonstrates our contribution to a versatile generative modeling framework. We agree with the reviewers that more application-specific effort is necessary to fully realize SLM's potential.

W3:More baselines for CFG

We apologize for the typographical error in Section 3.7. The correct formulation is given by:

$$\hat{\text{NN}}_{\theta}(\mathbf{x^c_t}, t) = \exp\Big( \gamma \log \text{NN}_{\theta}(\mathbf{x^c_t}, t, \text{cls}) + (1 - \gamma)\log \text{NN}_{\theta}(\mathbf{x^c_t}, t, K) \Big)$$

Similar to D-CFG (Schiff et al., 2025) and PFG (Nisanoff et al., 2025), our CFG implementation decomposes $\log \hat{\text{NN}}_{\theta}^{\gamma}(\mathbf{x^c_t}, t)$ into a linear combination of $\gamma \log \text{NN}_{\theta}(\mathbf{x^c_t}, t, \text{cls})$ and $(1-\gamma)\log \text{NN}_{\theta}(\mathbf{x^c_t}, t, K)$, enabling classifier-free guidance without the need for an auxiliary predictor.
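As a concrete illustration of this log-linear mixing, here is a minimal sketch (hypothetical names and shapes; `cond` and `uncond` stand for the class-conditional and unconditional network outputs, and the mixture is renormalized over the vocabulary):

```python
import numpy as np

def cfg_mix(cond: np.ndarray, uncond: np.ndarray, gamma: float) -> np.ndarray:
    """Guided distribution: renormalized exp(gamma*log p_cond + (1-gamma)*log p_uncond)."""
    log_mix = gamma * np.log(cond + 1e-12) + (1.0 - gamma) * np.log(uncond + 1e-12)
    p = np.exp(log_mix - log_mix.max(axis=-1, keepdims=True))  # subtract max for stability
    return p / p.sum(axis=-1, keepdims=True)

cond = np.array([0.7, 0.2, 0.1])
uncond = np.array([0.4, 0.4, 0.2])
guided = cfg_mix(cond, uncond, gamma=2.0)  # gamma > 1 sharpens toward the conditional mode
```

Setting `gamma=1` recovers the conditional prediction and `gamma=0` the unconditional one, matching the formulation above.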

In response, we have now included both UDLM and DUO in our CFG experiments. Updated results for DNA enhancer design with classifier-free guidance are shown in the table below.

| Model | Mel FBD ↓ | FB FBD ↓ |
|---|---|---|
| BFN CFG | 2.3 ± 0.1 | 2.3 ± 0.2 |
| Dirichlet FM CFG | 1.9 ± 0.4 | 1.0 ± 0.3 |
| UDLM CFG | 0.3 ± 0.1 | 3.1 ± 0.1 |
| DUO CFG | 1.8 ± 0.4 | 2.1 ± 0.3 |
| SLM CFG | 1.4 ± 0.1 | 1.0 ± 0.1 |

With CFG, UDLM performs well on the Mel dataset but struggles on FB, while DUO falls behind both UDLM and SLM. Overall, these results show that SLM delivers the most balanced performance.
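For context on the metric: FBD (Fréchet Biological Distance) is commonly computed like FID, i.e., as the Fréchet distance between Gaussians fitted to embeddings of real and generated sequences under a pretrained encoder. A numpy-only sketch (the encoder itself is assumed, not specified by this excerpt):

```python
import numpy as np

def frechet_distance(emb_a: np.ndarray, emb_b: np.ndarray) -> float:
    """Fréchet distance between Gaussians fit to two embedding sets of shape (n, d)."""
    mu_a, mu_b = emb_a.mean(axis=0), emb_b.mean(axis=0)
    cov_a = np.cov(emb_a, rowvar=False)
    cov_b = np.cov(emb_b, rowvar=False)
    # tr(sqrtm(cov_a @ cov_b)) equals the sum of square roots of the product's eigenvalues
    eigvals = np.linalg.eigvals(cov_a @ cov_b)
    tr_covmean = np.sum(np.sqrt(np.abs(eigvals.real)))
    return float(np.sum((mu_a - mu_b) ** 2)
                 + np.trace(cov_a) + np.trace(cov_b) - 2.0 * tr_covmean)

rng = np.random.default_rng(0)
x = rng.normal(size=(500, 8))
# Identical embedding sets give a (numerically) zero distance.
assert frechet_distance(x, x) < 1e-6
```

Lower is better: a small FBD means the generated sequences' embedding statistics closely match those of the real data.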

Comment

Thank you for including the new experiments. Could the authors please clarify the exact training procedure used for DUO?

The fact that DUO performs worse than UDLM on Mel FBD and other metrics for protein sequence modeling is very suspicious and suggests a potential issue with how DUO was trained. Could the authors confirm whether they used the curriculum learning strategy during training? Additionally, did they compute the diffusion transformation operator correctly? This is a crucial step because it helps determine the end points of the noise schedule during training. Even if curriculum learning was omitted, DUO should theoretically outperform UDLM due to its use of a Rao-Blackwellized ELBO. For simplicity, the authors should train DUO without curriculum learning.

DUO (0.042 ± 0.002) performs worse in this setting for one possible reason: it was not paired here with discrete consistency distillation in a comparison of all schemes training from scratch.

Discrete consistency distillation shouldn't be used in this case, as it is a method that speeds up sampling at the cost of generation quality.

It's pretty interesting that the proposed method is better than UDLM. Given that DUO is the current SOTA in uniform state discrete diffusion, the authors must provide a thorough comparison with DUO. I'm happy to consider raising my scores if the authors can address my concerns.

Comment

We appreciate the reviewer’s thoughtful comments and share your concern regarding DUO’s unexpected underperformance relative to UDLM on DNA and protein sequence modeling benchmarks.

Training Configuration and Implementation

First, we want to clarify the training configuration used when reproducing DUO's results. In our previous experiments, we incorporated the official DUO module (including the DUO and DUO_BASE classes in algo.py) into our framework without any modification, to ensure a faithful reproduction of DUO. We also applied the standard curriculum learning (CL) from the start through the maximum training steps, exactly as in DUO's repository. This alignment gives us confidence that the diffusion transformation operator was computed correctly.


Additional Optimization Efforts and the modified Results

Beyond curriculum learning (CL), we conducted a comprehensive hyperparameter search and explored alternative greedy sampling strategies to maximize DUO's performance on our biological tasks. We report the latest results in the updated tables below:


DNA Promoter Design

| Model | MSE ↓ |
|---|---|
| Autoregressive | 0.034 ± 0.001 |
| UDLM | 0.030 ± 0.001 |
| DUO w/ CL | 0.029 ± 0.003 |
| DUO w/o CL | 0.027 ± 0.0002 |
| SLM | 0.0265 ± 0.0006 |

DNA Enhancer Design

| Model | Mel FBD ↓ | FB FBD ↓ |
|---|---|---|
| UDLM CFG | 0.3 ± 0.1 | 3.1 ± 0.1 |
| DUO w/o CL CFG | 0.2 ± 0.1 | 3.0 ± 0.2 |
| DUO w/ CL CFG | 0.2 ± 0.1 | 3.5 ± 0.3 |
| SLM CFG | 1.4 ± 0.1 | 1.0 ± 0.1 |

Protein Sequence Design

| Model | Designability ↑ | Diversity ↓ | Fitness ↓ | Self-Consistency ↓ |
|---|---|---|---|---|
| AR | 38.96 | 0.1689 | 2.775 | 10.41 |
| UDLM | 38.79 | 0.1833 | 2.526 | 10.37 |
| DUO w/ CL | 39.08 | 0.1341 | 2.438 | 9.76 |
| DUO w/o CL | 39.53 | 0.1253 | 2.146 | 9.49 |
| SLM | 41.08 | 0.1598 | 2.127 | 8.286 |

Explanation: Why DUO without Curriculum May Excel

We have now confirmed the reviewer's claim that DUO outperforms UDLM, even without the help of CL. DUO's paper suggests that CL helps reduce gradient variance during training, yet including CL did not yield a performance boost in our experiments. We offer two possible explanations:

1. Vocabulary Constraints

DUO's low-variance curriculum relies on a tempered softmax to smooth the argmax operation, which provides more benefit when the vocabulary size K is large (e.g., OpenWebText). In our settings, DNA (K=4) and protein (K=20), the advantages of the tempered softmax are diminished, making CL less impactful or even slightly detrimental.

2. Gap between Training Dynamics and Downstream Metrics

Unlike text modeling, biological sequence models often exhibit a rapid drop in loss early in training, followed by a long plateau, while downstream properties (e.g., functional property scores) continue to improve. Thus, additional training steps after CL may be required before its effect shows in downstream tasks, which we could not explore within the time limit of the rebuttal period.
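To make the first reason above concrete, here is a small sketch of a tempered softmax (illustrative logits, not DUO's actual implementation): as the temperature falls the distribution collapses onto the argmax, and with a tiny vocabulary such as K=4 there is little probability mass left to smooth.

```python
import numpy as np

def tempered_softmax(logits: np.ndarray, tau: float) -> np.ndarray:
    """Softmax with temperature tau; as tau -> 0 it approaches the one-hot argmax."""
    z = logits / tau
    z = z - z.max()  # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

logits = np.array([2.0, 1.0, 0.5, 0.1])  # e.g., a K=4 DNA vocabulary
for tau in (1.0, 0.5, 0.1):
    print(tau, np.round(tempered_softmax(logits, tau), 3))
```

With small K the gap between the smoothed distribution and the hard argmax closes quickly as tau decreases, which is consistent with CL contributing less in the DNA and protein settings.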

Comment

As we approach the closing of the author–reviewer dialogue period, we respectfully request your final review of our manuscript in light of the clarifications and additional information we have just provided. Your thorough assessment is highly appreciated, and an early response would be most helpful.

With sincere thanks for your time and expertise.

Review
4

This paper proposes a simplex-based diffusion model, the Shortlisting Model (SLM). SLM operates on simplex centroids, reducing generation complexity and enhancing scalability, and incorporates a flexible implementation of classifier-free guidance, enhancing unconditional generation performance.

Strengths and Weaknesses

This method does not demonstrate outstanding performance in the language modeling experiments and exhibits a certain gap compared to autoregressive algorithms. While it achieves better results on DNA sequence design tasks, those experiments include neither autoregressive algorithms nor MDLM, a method sharing similar principles and demonstrating comparable effectiveness in the language modeling experiments.

Questions

Why are different models selected for comparison across different experiments, particularly given the absence of autoregressive algorithms (which demonstrate superior performance in language modeling) and MDLM (which shares conceptual similarities and has shown comparable effectiveness on such tasks)?

Limitations

yes

Final Justification

The authors' rebuttal has alleviated my concerns and doubts.

Formatting Issues

None

Author Response

W1:Performance gap between Autoregressive Models

We thank the reviewer for this insightful comment.

It is true that, to date, autoregressive models remain the gold standard for unconditional, left-to-right language modeling, and no discrete diffusion approach has yet fully closed this gap under the same data-and-task constraints. The original SEDD paper once appeared to close it, but only thanks to extra training data; this was later corrected by MDLM with a fairer comparison.

Given the results in Table 1, SLM nevertheless surpasses all simplex-based methods and most existing discrete diffusion variants (both masked and uniform), substantially narrowing the gap to autoregressive performance to a degree not reached by prior discrete diffusion schemes.

While we acknowledge that the left-to-right inductive bias has been highly effective for autoregressive (AR) models in language modeling, distinguishing them from non-autoregressive approaches, this same assumption becomes a significant limitation in other critical domains. For instance, in biological sequence modeling (e.g., proteins and DNA), non-autoregressive models often exhibit superior performance. This reflects the broad potential for discrete diffusion models to make a significant impact in a wide range of scientific fields, far beyond just language modeling.

Consistent with this, SLM outperforms autoregressive baselines in DNA and protein sequence modeling. As shown in the aggregated table below, this unified treatment of general discrete variables highlights both the versatility and the innovation of our approach.

| Task | Evaluation Metric | AR | MDLM | UDLM | SLM |
|---|---|---|---|---|---|
| Text Modeling (Text8) | BPC ↓ | 1.23 | ≤1.39 | ≤1.60 | ≤1.38 |
| DNA Sequence Modeling (Promoter Design) | MSE ↓ | 0.034 ± 0.001 | 0.028 ± 0.001 | 0.030 ± 0.001 | 0.0265 ± 0.0006 |
| Protein Sequence Modeling (UniRef50) | Designability ↑ | 38.96 | 39.47 | 38.79 | 41.08 |
| Protein Sequence Modeling (UniRef50) | Diversity ↓ | 0.1689 | 0.1634 | 0.1833 | 0.1598 |
| Protein Sequence Modeling (UniRef50) | Fitness ↓ | 2.775 | 2.230 | 2.526 | 2.127 |
| Protein Sequence Modeling (UniRef50) | Self-Consistency ↓ | 10.41 | 8.562 | 10.37 | 8.286 |

Aggregated Table: SLM's performance compared to AR, MDLM and UDLM baselines.


W2: Baselines on DNA sequence design tasks

Sorry for the lack of clarity. The autoregressive baseline was labeled "Language Model" in the DNA sequence design experiment, with numbers copied directly from the Dirichlet Flow-Matching paper.

Given the results in the aggregated table, SLM achieves better results than the autoregressive and MDLM baselines in DNA modeling: an MSE of 0.0265 ± 0.0006 compared to AR's 0.034 ± 0.001 and MDLM's 0.028 ± 0.001, which we hope resolves the reviewer's concerns.


Q1: Different Selection of Baselines under different experimental tasks

We sincerely appreciate the reviewer's constructive feedback regarding consistent baseline comparisons. We therefore provide the aggregated table above to show overall performance on text, DNA sequence, and protein sequence modeling against three main baseline families: autoregressive, masked discrete diffusion, and uniform discrete diffusion models. The full set of comparisons for each task can also be found in our response to reviewer unN3.

As this shows, evaluation with consistent baselines continues to support SLM's effectiveness, as stated in our response to W1.

Comment

Thank you for acknowledging our previous message and for the time you have already dedicated to reviewing our work. We fully understand the many demands on your time at this stage of the process. If possible, we would be truly grateful if you could take a moment to revisit our latest clarifications, including our replies to other reviewers, in case they offer new perspectives that could assist you in reaching your final assessment.

We sincerely appreciate your efforts and understanding.

Comment

With the author–reviewer discussion phase nearing its end, we would be most grateful if you could take a moment to consider our most recent replies before providing your final evaluation of our submission. Your insights have been invaluable, and we sincerely hope for your prompt feedback.

Final Decision

The authors introduce a new shortlisting diffusion model, which differs from standard masked diffusion and uniform discrete diffusion approaches. Reviewers are enthusiastic about this novel modeling framework, and I am also inclined to recommend acceptance.

That said, I encourage the authors to include additional experiments and incorporate comparisons with existing methods, as presented during the rebuttal, in the final version.