Sunday, March 31, 2013

Hashing

  I ran across an issue of some importance several years ago that's gotten little acknowledgement: basically, how well a lexicon uses the distinctions available in its phonemic inventory. It's a step away from phonotactics.
  Essentially, it makes little sense to distinguish sounds if they don't contribute to the distinctiveness of words. It's slightly contentious, but some languages test this more strongly than others in terms of allophony (some languages' phonemes are very flexible, as in Pirahã). There are also extra ways to distinguish words, e.g. syntax and prosody, so the level of redundancy probably varies as well.
  The point tho is that if the words in your language are, e.g., CVCV... etc., and you have syllables A, B, C, and D, and a lexicon of a few words using them, e.g. AB and CD, then you can do several things, because those words aren't utilizing the inventory very well. Maximally, if these are the only two words in the language, you actually only need two syllables in your inventory for this lexicon, e.g. AA and CC (or any other pair of syllables), maybe even condensing the words to one syllable each, e.g. A and B. Anyways, this process is known as hashing, and it was solved for optimality in 1992.
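  Just to make that concrete, here's a toy sketch in Python (the name compress_inventory is mine, and the greedy merging is only an illustration of the recoding idea, not the optimal construction mentioned above): it keeps collapsing syllables together as long as no two words in the lexicon become identical.

```python
from itertools import combinations

def compress_inventory(lexicon):
    """Greedily merge syllables while keeping every pair of words distinct.
    A toy illustration of recoding a lexicon onto a smaller inventory."""
    syllables = sorted({s for word in lexicon for s in word})
    mapping = {s: s for s in syllables}          # start with the identity map

    def respell(word):
        return tuple(mapping[s] for s in word)

    distinct_words = len({tuple(w) for w in lexicon})
    changed = True
    while changed:
        changed = False
        targets = sorted(set(mapping.values()))
        for a, b in combinations(targets, 2):
            # try collapsing syllable-target b onto a
            trial = {s: (a if t == b else t) for s, t in mapping.items()}
            respelled = {tuple(trial[s] for s in w) for w in lexicon}
            if len(respelled) == distinct_words:  # no two words merged
                mapping = trial
                changed = True
                break
    return mapping, [respell(w) for w in lexicon]

# the toy lexicon from above: words AB and CD over syllables A, B, C, D
lexicon = [("A", "B"), ("C", "D")]
mapping, respelled = compress_inventory(lexicon)
print(mapping)    # three of the four syllables collapse together
print(respelled)  # the two words stay distinct, e.g. ('A','A') and ('A','D')
```

  On the AB/CD example it ends up using only two syllables, which matches the point above; real lexicons obviously leave room for sound symbolism, morphology, and redundancy, so this is just the bare combinatorial floor.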
  So I'm not a programmer or anything, but it's cool. It lets you know how important words are to the sound of the language, e.g. by testing the high-frequency words and sounds of the language, testing polysemy, etc. It would be good for designing a shorthand, e.g. adapting Plover to a language, and it's how a stenotype for Japanese uses only 10 keys, basically a home row.
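  For the "which sounds are doing the work" part, even a crude frequency count gets you started; here letters stand in for phonemes or syllables, and sound_workload is just a made-up helper name, not anything from Plover or a steno system:

```python
from collections import Counter

def sound_workload(word_list):
    """Count how often each symbol (a letter, standing in for a phoneme or
    syllable) appears across a word list, as a rough look at which
    distinctions carry the most load."""
    counts = Counter(ch for word in word_list for ch in word)
    total = sum(counts.values())
    return {ch: round(n / total, 3) for ch, n in counts.most_common()}

# toy word list; a real test would use a frequency-ranked lexicon
print(sound_workload(["taka", "tika", "kata", "mata"]))
```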

Saturday, March 30, 2013

Internet jargon & slang

  Wow, it's been so long since my last post! Being busy is part of it, but it's mostly been with very different things, none of them to do with language.
  I'm revisiting an old idea of mine, and, I've found out, others' too: collaborative, crowd-sourced conlanging. Essentially, how can we create a language together so that we understand it, use it, and develop it, and how do we inculcate English monoglots, e.g. newbies? That seems to be one of the biggest things... English messes up and waters down everything.
  You see this in Internet slang, where catchphrases go from being catchphrases to slang shibboleths, eventually becoming idioms, then euphemisms, and finally synonyms. That slowly happened to Boontling, the real-life dialect that developed out of some bored children's slang games in remote California. This happens in countries where English speakers settle and fail to learn the language, and then everyone bends over backwards to accommodate them until their language fails to attract new speakers and status.
  It's strange, because English suffered the Norman invasion in 1066 AD, but the people didn't learn the French of the Norman invaders--they kept their English, which was severely bastardized until it reached Middle English, but they apparently didn't care. Then the King at the time said, "Let us speak English in the court!" and that then and there gave English its new ascendancy.
  How do you make people learn? That is what I've tried to find out, and it seems like you have to do quite a bit of the work for them, and they'll cover the difference. Other posts have covered what kind of system I think would be needed (essentially a news site plus message boards and people with a lot of time on their hands).
  What would these people have in common? They would need context to discuss things in a world, and the virtual world doesn't cut it. There have been efforts (e.g. on Kickstarter) to create immersive video games, but they look bad, hence they're not immersive. Second Life and Minecraft don't really work well either IMO, tho Minecraft is extendable by its users. I'm not aware of whether Minecraft has vocal interaction, but that would probably be necessary. The thing is, if it's public, people don't feel the need to socialize in some cryptolect.
  Anyways, Kalusa, the first and really only language like this, worked pretty well until someone decided to start stuffing the ballots, but in actuality it suffered from the same problems as other conlangs. It had words for things no one would ever have the opportunity to speak about in public, like animals. If conlangs were realistic they would have native words for computer guts, Internet chat forms, and the various things you can see and learn about in fictional worlds or pictures. Inevitably, they would come to be like counting the bricks on the inside of a jail cell. That's why we have real languages, so we can go to them.
  It's just that conlanging hasn't touched on what people want most: money, food, power, and sex. Esperanto, because it's totally against hegemony (well, really, any second language would be too), throws out power. Not using English throws out power as a motivator, so that's I don't know how many people? It does allow you to potentially get a date if you're not a total bore... in theory. If people have time to spend on language study, they may or may not have a lot of money, and hence enough food. If a language is going to take hold, it's going to have to have both a significant amount of grammar and a significant body of works translated into it, without a further incentive to translate more original works.
  Essentially, it has to be a fun game.
  Gary Shannon discussed, a while ago, a way to change English into another language. That would be pretty fun and workable, right? The grammar would be expressed thru a longish article slowly changing the language into the target deformation, gradually building up until the whole thing is totally replaced. At least, that's one way. It could be reinforced with funny limericks, etc.
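  Roughly, the mechanic could look like this (a minimal sketch, not Shannon's actual scheme; the staged substitutions, the invented words, and the name gradualize are all mine): each chunk of the article turns on one more replacement rule, so the text drifts further from English as you read.

```python
import re

def gradualize(text, stages):
    """Apply replacement rules cumulatively, one extra stage per paragraph,
    so the article gradually shifts from English into the target forms."""
    paragraphs = text.split("\n\n")
    out, rules = [], {}
    for i, para in enumerate(paragraphs):
        if i < len(stages):
            rules.update(stages[i])        # switch on this stage's swaps
        for eng, new in rules.items():
            para = re.sub(rf"\b{re.escape(eng)}\b", new, para)
        out.append(para)
    return "\n\n".join(out)

article = "the dog saw the cat\n\nthe cat saw the dog\n\nthe dog ran"
stages = [{"the": "la"}, {"cat": "miso"}, {"dog": "wafu"}]
print(gradualize(article, stages))
# paragraph 1 only swaps 'the'; by paragraph 3 all three rules apply
```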
  This works a lot better if there is a simple pattern people can extrapolate, e.g. phonetically and grammatically backwards English, aka Shilging, which is more than either a game or a relex, and certainly sounds fun (hardest thing? reversing the prosody, which might be impossible).
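  For flavor, here's a crude spelling-level version of the reversal (real Shilging would reverse phonemes and grammar, not letters, and the name shilg is just my placeholder):

```python
def shilg(sentence):
    """Crude, orthography-only 'backwards English': reverse the word order
    and the letters inside each word."""
    return " ".join(word[::-1] for word in reversed(sentence.split()))

print(shilg("the cat sat on the mat"))  # -> "tam eht no tas tac eht"
```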
  On the other hand, it seems like people will always move on. And that's not something any language can deal with.