Koen G > 14-09-2019, 05:51 PM
(14-09-2019, 05:35 PM)Anton Wrote: As I said yesterday in one of the statistics threads (I don't remember which), I think at least two orders of magnitude more than the vocabulary size. But, once again, you can explore that yourself: take a large text of, say, 100,000 words and calculate the entropy over a fragment of it, say the first 10,000 words (note that the fragment's vocabulary will probably be smaller than that of the whole text). Then take a larger fragment of, say, 20,000 words, and so on, and observe how the values change.
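The experiment Anton describes is easy to script. Below is a minimal sketch in Python, not anyone's actual code: "corpus.txt" is a placeholder for any large plain-text file, and the quantity computed is word-level unigram entropy (h1), printed together with the vocabulary size of each growing fragment.

Code:
import math
from collections import Counter

def unigram_entropy(words):
    """Shannon entropy (bits) of the word-frequency distribution."""
    counts = Counter(words)
    total = len(words)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# "corpus.txt" is a placeholder name; use any large plain text.
with open("corpus.txt", encoding="utf-8") as f:
    words = f.read().split()

# Entropy over growing prefixes, as Anton suggests: 10k, 20k, 40k, ... words.
for size in (10000, 20000, 40000, 80000, len(words)):
    frag = words[:size]
    print(f"{len(frag):>6} words  vocab={len(set(frag)):>6}  "
          f"h1={unigram_entropy(frag):.3f} bits")

Both the vocabulary and the entropy estimate should keep climbing at first and then level off as the fragment grows, which is the convergence behaviour Anton suggests watching for.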
ReneZ > 15-09-2019, 04:52 AM
(14-09-2019, 08:51 AM)nablator Wrote: (Values for the entire TT with unclear "?" vords removed):

h0      h1      h2
12.97   10.45   4.36

When vords are randomly shuffled, h2 increases:

h0      h1      h2
12.97   10.45   4.52
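These figures are straightforward to reproduce. Below is a minimal sketch under the assumption that h0, h1, h2 here are word-level entropies: h0 = log2(vocabulary size), h1 = unigram entropy, and h2 = the conditional entropy of a vord given its predecessor. "vords.txt" is a hypothetical file name for a whitespace-separated list of transliterated vords; the "?" filter follows nablator's description of removing unclear vords.

Code:
import math
import random
from collections import Counter

def entropies(tokens):
    n = len(tokens)
    uni = Counter(tokens)
    bi = Counter(zip(tokens, tokens[1:]))
    h0 = math.log2(len(uni))                     # log2 of vocabulary size
    h1 = -sum((c / n) * math.log2(c / n) for c in uni.values())
    m = n - 1
    h_joint = -sum((c / m) * math.log2(c / m) for c in bi.values())
    return h0, h1, h_joint - h1                  # h2 = H(pair) - H(single)

# "vords.txt" is a placeholder: one transliteration, whitespace-separated,
# with unclear "?" vords dropped as in nablator's run.
with open("vords.txt", encoding="utf-8") as f:
    vords = [v for v in f.read().split() if "?" not in v]

print("original: h0=%.2f h1=%.2f h2=%.2f" % entropies(vords))

shuffled = vords[:]
random.shuffle(shuffled)   # destroys word order; h0 and h1 are unchanged
print("shuffled: h0=%.2f h1=%.2f h2=%.2f" % entropies(shuffled))

Shuffling cannot change h0 or h1 (the multiset of vords is identical), so any movement in h2 isolates the contribution of word order. On a sample this size the h2 estimate is strongly biased low, since most bigrams occur only once, which is why the shuffled value (4.52) sits far below h1 rather than matching it.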