(05-07-2025, 02:23 AM)ReneZ Wrote: (04-07-2025, 07:14 PM)nablator Wrote: satisfied with an explanation of why "the whole thing cannot work" (dixit ReneZ).
Very brief:
The high level of repetitions is not the most conspicuous artifact of the text. Perhaps not even the second most conspicuous, but probably the third. This is so subjective that I don't want to argue about it.
Using this as the prime method for the text generation, while it does nothing towards the most conspicuous aspects (low entropy, word patterns - same thing really), is my biggest problem.
But this risks getting us off topic.
Short answer: I developed the Self-Citation Method by systematically reverse-engineering the word patterns identified in the Voynich Manuscript.
I also asked ChatGPT for a longer answer:
From a structural and statistical perspective, self-citation naturally leads to both low entropy and characteristic word patterns.
1. What is the Self-Citation Method?
It refers to generating new text by the following steps (sketched in code after this list):
- Copying previously written words or fragments
- Making small, predictable modifications
- Repeating this process recursively over the growing text
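For concreteness, here is a minimal Python sketch of that loop. The seed words, glyph inventory, and modification rules below are illustrative placeholders, not the actual rules derived from the manuscript:

Code:
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical seed vocabulary and glyph inventory (placeholders only).
text = ["qokeedy", "chedy", "dain"]
GLYPHS = "qokedychainlrst"
PREFIXES = ["qo", "o", "ch"]

def modify(word):
    """Apply one small, predictable change to a copied word."""
    op = random.choice(["swap", "add_prefix", "drop_glyph"])
    if op == "swap":
        i = random.randrange(len(word))
        return word[:i] + random.choice(GLYPHS) + word[i + 1:]
    if op == "add_prefix":
        return random.choice(PREFIXES) + word
    return word[1:] if len(word) > 2 else word  # drop_glyph

# Generate: copy a previous word, modify it, append the result.
for _ in range(50):
    text.append(modify(random.choice(text)))

print(" ".join(text))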
2. Why Does That Lead to Low Entropy?
Entropy, in information theory, is a measure of unpredictability or information content.
Self-citation with limited modification produces:
- High repetition of sequences
- Restricted "alphabet" or symbol combinations in context
- Predictable transitions between words or fragments (note: there are no structure-changing modification rules, such as reordering of glyphs)
- A biased distribution of word lengths and glyph patterns
The result: the text statistically mirrors a low-entropy system, similar to a compressed code, a repetitive ritual text, or an artificially constrained system.
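To make "low entropy" concrete: Shannon entropy is H = -Σ p(x) log2 p(x) over the symbol distribution, so repeated sequences pull it down. A small Python check on toy strings (not actual Voynich data) shows how repetition of copied fragments lowers the bigram entropy relative to a random baseline:

Code:
import math
import random
from collections import Counter

def shannon_entropy(symbols):
    """H = -sum(p * log2(p)) over the observed symbol distribution."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def bigram_entropy(s):
    """Entropy of adjacent character pairs; repetition drives it down."""
    return shannon_entropy([s[i:i + 2] for i in range(len(s) - 1)])

random.seed(0)
repetitive = "qokeedy chedy qokeedy chedy okeedy chedy " * 50
alphabet = sorted(set(repetitive))
rand_text = "".join(random.choice(alphabet) for _ in range(len(repetitive)))

print(f"bigram H, repetitive text: {bigram_entropy(repetitive):.2f} bits")
print(f"bigram H, random baseline: {bigram_entropy(rand_text):.2f} bits")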
3. Why Does That Produce Word Patterns?
The Voynich manuscript exhibits:
- Clusters of similar-looking "words"
- Systematic variations
- Repeated fragments that mimic prefixes, suffixes, or stems
Self-citation directly causes this because:
- When an existing word is copied and slightly modified, families of related words emerge
- Frequent reuse of certain fragments (e.g., "qo", "ol", "chedy") occurs naturally
- Sequences of similar words appear close together, reflecting recent "copy targets"
Thus, the observable word patterns, at both the micro (within lines) and macro (across sections) levels, are a direct byproduct of the self-referential text generation process.
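A toy illustration of how such families can be detected: group words that are one edit apart. The word list below is invented in the style of EVA transcription words; a real analysis would run over an actual transcription:

Code:
from collections import defaultdict

def within_one_edit(a, b):
    """True if b is reachable from a by one substitution,
    insertion, or deletion (a crude 'small modification' test)."""
    if abs(len(a) - len(b)) > 1:
        return False
    if len(a) == len(b):
        return sum(x != y for x, y in zip(a, b)) <= 1
    longer, shorter = (a, b) if len(a) > len(b) else (b, a)
    return any(longer[:i] + longer[i + 1:] == shorter
               for i in range(len(longer)))

# Toy vocabulary (illustrative only, not real transcription data).
words = ["chedy", "shedy", "chey", "qokedy", "qokeedy",
         "okedy", "dain", "daiin", "aiin", "okain"]

# Group each word with its one-edit neighbours: the "families"
# that recursive copy-and-modify generation produces.
families = defaultdict(set)
for w in words:
    for v in words:
        if w != v and within_one_edit(w, v):
            families[w].add(v)

for w, related in sorted(families.items()):
    print(f"{w}: {sorted(related)}")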
4. Illustrative Analogy
Imagine a program that:
- Starts with a few invented "words": qokeedy, chedy, dain
- At each step:
  - Selects a previous word or fragment
  - Modifies it slightly (replaces glyphs, or adds or removes a prefix)
  - Appends the result to the text
Over thousands of iterations:
- Some fragments become dominant
- Similar-looking words proliferate
- Novel sequences appear rarely
- Overall entropy remains low
- The "vocabulary" stabilizes into recognizable word families
This is exactly what we observe in the Voynich manuscript; a runnable version of the loop is sketched below.
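A self-contained sketch of that program, again with placeholder rules. The 40% verbatim-copy probability and the glyph set are arbitrary choices for illustration; the point is that the type/token ratio flattens out and dominant fragments emerge without being hard-coded:

Code:
import random
from collections import Counter

random.seed(1)
GLYPHS = "qokedychainlrst"

def modify(word):
    """One small change: substitute a glyph, or add/drop the first glyph."""
    op = random.choice(["swap", "add", "drop"])
    if op == "swap":
        i = random.randrange(len(word))
        return word[:i] + random.choice(GLYPHS) + word[i + 1:]
    if op == "add":
        return random.choice(GLYPHS) + word
    return word[1:] if len(word) > 2 else word

text = ["qokeedy", "chedy", "dain"]
for step in range(1, 5001):
    source = random.choice(text)
    # Verbatim copies and small modifications are both self-citation.
    text.append(source if random.random() < 0.4 else modify(source))
    if step % 1000 == 0:
        # A flattening type/token ratio means the vocabulary stabilizes.
        ttr = len(set(text)) / len(text)
        print(f"after {step:4d} steps: type/token ratio = {ttr:.3f}")

# Dominant word-initial fragments emerge from the copying dynamics alone.
print("most common word-initial pairs:",
      Counter(w[:2] for w in text if len(w) >= 2).most_common(5))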
Conclusion:
The self-citation hypothesis does explain the most conspicuous aspects of the Voynich manuscript:
- Low entropy emerges naturally through repetitive, constrained copying
- Word patterns arise through systematic, recursive modifications
- No need for an independent mechanism to impose these features
Thus, if the self-citation process is properly constrained and recursive, it not only explains text generation but necessarily produces the statistical and structural anomalies seen in the manuscript.