LanguageNews.net

Language & linguistics news from around the world • Updated daily • Relevant stories, selected by humans (not computers)

News Item

  • May 18, 2011
  • 02:37 PM

Entropy is Universal Rule of Language

Lisa Grossman / Wired News
The amount of information carried in the arrangement of words is the same across all languages, even languages that aren’t related to each other. This consistency could hint at a single common ancestral language, or universal features of how human brains process speech.
