Talk:Information theory

Wow, I can't type. Gotta go, though. Sterilewalkie-talkie 20:21, 23 April 2009 (EDT)

Thanks for a great start! ħumanUser talk:Human 20:27, 23 April 2009 (EDT)
Does that mean I can brag that I wrote the article, the way Ed does about his? --Kels 20:52, 23 April 2009 (EDT)
Of course. ħumanUser talk:Human 21:08, 23 April 2009 (EDT)
Bow down before me, mortals, for I have created yet another article! --User #7 21:14, 23 April 2009 (EDT)
As time goes by, you've "written" a pretty good article... nice "work"! ħumanUser talk:Human 04:08, 8 May 2009 (UTC)
Feature, perhaps? It seems to be rapidly approaching that quality. --User #7 00:14 8 May 2009 (UTC)
Let's just replace the main page with it, its glory is so profound... ħumanUser talk:Human 04:44, 8 May 2009 (UTC)
I still have some thoughts for improvements. Time, however, is short. Sterile ax murderer 01:48, 11 May 2009 (UTC)

(outdent) I will be futzing with this a bit, since I see some issues in the clarity of the first paragraph. Bear with me, and jump in any time. Sprocket J Cogswell (talk) 00:32, 13 July 2009 (UTC)

Please do. The first paragraph was different before, and I realize now it was and still is problematic. Sterile ax murderer 00:43, 13 July 2009 (UTC)
More squalor than scholar here, but I think there is grist to be milled in the notion that the genome is way bigger than it needs to be. Not sure this can be called redundancy in any sense that would be useful for error-correction. Given a simple 4-state coding method and the mechanisms of gene expression (not by general-purpose computer nor more specialized "hardware," but by the natural molecular tropisms of biochemistry as it has come to be...) the system seems to have arrived at something like commenting out AFOAL of the available text. Hmmm... Sprocket J Cogswell (talk) 01:55, 13 July 2009 (UTC)

matches

Sorry if this is irrelevant but I'm on a very slow connection. My take on the matches:information thing:

Information contained in a dropped number of matches:

  1. the number of matches
  2. the height from which they were dropped
  3. the mass of each match
  4. the elasticity of the matches
  5. the elasticity of the surface they were dropped on
  6. the value of local gravity
all these, and more, would have an effect on the final positions of the matches and could be determined by sufficiently rigorous examination.

Information contained in a placed number of matches:

  1. the intent of the placer
nuff said. This message brought to you by: Toastrespondand honey 01:09, 13 July 2009 (UTC)
Oh, totally. PJR just had an info-gasm because he spelled a word, and hence the matches have "meaning" in a linguistic sense, and an intelligent source (I suppose you can call Philip that) generated it. What it has to do with genetics is unclear because we've never observed anyone naturally generating DNA sequences and the sequences don't "mean" anything in that sense. False comparison. If the proteins that the DNA encodes started spelling words, then we could talk. Sterile ax murderer 01:17, 13 July 2009 (UTC)
As I said elsewhere, except for one slightly misplaced match (I will leave it to the reader to determine which one), the "random match" picture PJR uploaded is actually a proof of Fermat's Last Theorem. He fooled us again! ħumanUser talk:Human 02:55, 13 July 2009 (UTC)

Eggsellent article

Having (eventually) read this, I may say that it's one of our very best articles. Featured & cover and any other praise should be heaped on it.

One typo, early on:

" Entropy increases in thermodynamics in a closed system according to the Second Law, but it is unclear what a closed system is <> genetic information."

There seems to be a word or two missing where I've put "<>"? This message brought to you by: Toastrespondand honey 01:43, 13 July 2009 (UTC)

Attempted to fix it. Sprocket J Cogswell (talk) 02:19, 13 July 2009 (UTC)
I'm not sure what you want where the cit. needed tag is placed. Is there such a thing as a "closed system" in information theory? Sterile ax murderer 02:29, 13 July 2009 (UTC)
Essentially it is a sly little indication that I acknowledge myself to be talking out of my ass, er, have made that sentence up out of whole cloth. How could there be such a thing as a "closed system" in information theory? If I understand the universe of discourse correctly, it consists of all possible messages, and a process of at least five steps for getting a message across:
  1. Idea in sender's mind
  2. Encoding method(s) (may be more than one layer of encoding and decoding)
  3. Channel, which has a certain capacity which may be further limited by noise
  4. Decoding method(s)
  5. Idea in receiver's mind
WTF does any of this have to do with thermodynamics, other than that entropy and information are both described logarithmically? Sprocket J Cogswell (talk) 02:42, 13 July 2009 (UTC)
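A minimal sketch of that five-step chain in Python, with plain strings standing in for the "ideas" at either end; the bit-flipping channel and the function names are illustrative assumptions, not anything taken from Shannon's paper:

```python
# A sketch only: plain strings stand in for the "idea" at each end,
# and the channel is a simple bit-flipper (a binary symmetric channel).
import random

def encode(message: str) -> list[int]:
    """Step 2: turn each character into its 8-bit pattern."""
    return [int(bit) for ch in message for bit in format(ord(ch), "08b")]

def channel(bits: list[int], flip_prob: float = 0.01) -> list[int]:
    """Step 3: each bit gets flipped with probability flip_prob (noise)."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits: list[int]) -> str:
    """Step 4: reassemble 8-bit groups back into characters."""
    return "".join(
        chr(int("".join(str(b) for b in bits[i:i + 8]), 2))
        for i in range(0, len(bits), 8)
    )

sent = "hello"                               # step 1: the original message
received = decode(channel(encode(sent)))     # step 5: the received copy
print(sent, "->", received)                  # usually intact, occasionally garbled
```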

Hopefully it's better now. PS: Info theory actually says little about the idea in the sender's head or the idea in the receiver's mind, in my semi-slightly informed notion of it. It's more about steps 2-4. Sterile ax murderer 02:52, 13 July 2009 (UTC)

Seems better now. The whole "idea" thing may be my own twisted interpretation of what the prototypical message represents. In human terms, that is what communication is about, but I don't suppose it takes a sentient life-form to generate messages that make a difference in the world. As I understand it, "information" is that which could not have been predicted before the message arrived, and is somewhat related to what difference the message made. Been a long time since I did any reading up on it. Sprocket J Cogswell (talk) 02:58, 13 July 2009 (UTC)
There still needs to be an original message for the encoder(s) to work on, and a received (more or less faithful) copy of it put out from the decoder(s), or else there is no way to evaluate steps 2~4. Perhaps
  1. Original message
  2. Encoding method(s) (may be more than one layer of encoding and decoding)
  3. Channel, which has a certain capacity which may be further limited by noise
  4. Decoding method(s)
  5. Received copy
is of more general application. Sprocket J Cogswell (talk) 03:09, 13 July 2009 (UTC)
What you're talking about is messages.
Quite honestly, I think you're both wrong; information is that which allows the past and future of a given situation to be deduced to a greater or lesser extent. The present condition of, say, the sun, contains information that allows its previous and subsequent condition to be ascertained. The analysis of that information is another matter. There is no requirement for any message, i.e. sender/receiver. Information is sufficient unto itself. (I like that!)
Someone once said that (paraphrase) "given the tools and the knowledge, the entire universe could be deduced from a single drop of water". That's information - no need for a sender or a receiver, just the facts. This message brought to you by: Toastrespondand honey
Your proposed definition has its merits, but goes beyond the bare minimum needed to work with information itself, shorn of meaning. Obviously, info is interesting to us precisely because it does not swim around in a meaningless vacuum, but information theory and semiotics cover different bits of ground. I have fallen asleep reading Wittgenstein. I'd much rather spend my reading time with more of George Lakoff's stuff. Both are heavy going, but George "speaks to my condition" better than Ludwig.
The definition of "information" used in the theory of Shannon and subsequent workers may be informally stated as "the part of the message that was not predictable prior to its receipt." I am paraphrasing Shannon's words from memory here. As I recall, Shannon and Weaver's book, The Mathematical Theory of Communication is a slim volume, about the size of Harry Frankfurt's On Bullshit. Not trying to say anything cute here, honestly, just sharing the recollection of holding each one open in front of me. Both books are reasonably accessible to the interested lay reader. Sprocket J Cogswell (talk) 12:35, 13 July 2009 (UTC)
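Shannon's informal definition corresponds to the entropy formula H = -Σ p·log2(p), the average "surprise" in bits per symbol. A minimal sketch, with the symbol probabilities simply estimated from the message itself as an illustrative choice:

```python
# A sketch only: entropy of a message, with symbol probabilities
# estimated from observed frequencies (an illustrative simplification).
from collections import Counter
from math import log2

def entropy_bits_per_symbol(message: str) -> float:
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(entropy_bits_per_symbol("aaaaaaaa"))  # 0.0 -- fully predictable, no surprise
print(entropy_bits_per_symbol("abcdefgh"))  # 3.0 -- eight equally likely symbols
```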
and an afterthought: The part of the message that was predictable may be called redundancy, in the vocabulary of information theory. Redundancy is useful, since a sufficiency of it allows uncorrupted transmission, and one of the uses of this whole information theory thingy has to do with designing slick ways to apply just enough redundancy to maintain acceptable fidelity through a noisy channel. Information and redundancy are not easily separated; one cannot sort data bits into two buckets, since each bit carries some of both. Sprocket J Cogswell (talk) 12:46, 13 July 2009 (UTC)
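As a toy illustration of trading added redundancy for fidelity through a noisy channel, here is the crudest possible scheme, a 3x repetition code with majority voting; real systems use far more efficient codes, and the error rate chosen here is arbitrary:

```python
# A sketch only: the crudest redundancy scheme, a 3x repetition code.
# Real channel coding (Hamming, Reed-Solomon, LDPC, ...) is far more efficient.
import random

def add_redundancy(bits):
    return [b for b in bits for _ in range(3)]       # send every bit three times

def noisy(bits, flip_prob=0.05):
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def majority_decode(bits):
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

original = [random.randint(0, 1) for _ in range(1000)]
received = majority_decode(noisy(add_redundancy(original)))
print(sum(a != b for a, b in zip(original, received)))
# typically a handful of errors, versus roughly 50 expected with no redundancy
```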
A useful blogpost (which I think is wrong about a few issues, but is generally informative) is here. I have more to write about this, but must work now! Sterile ax murderer 12:55, 13 July 2009 (UTC)
I may have been too hasty denying a connection between information and thermodynamic entropy. Sprocket J Cogswell (talk) 23:48, 13 July 2009 (UTC)
I think so. I've always thought of it as an inverse relationship: assuming a permanently expanding universe, the ultimate situation will be maximum entropy and minimum information - just a sea of photons going nowhere. Of course there might be some information already existing, but not yet found, which will change that view - that's what science is all about. This message brought to you by: Toastrespondand honey 00:33, 14 July 2009 (UTC)

As for your WP reference, I'm not sure if I understand it, really. I think it is still saying that mathematically info and energy have the same form and, as information entropy is more general than energy entropy, energy can be thought of as a subset or an "application." Is that your interpretation? (I still don't know if the second law applies, for example, from that paragraph, but I can't see how it could, as information entropy is a definition. The second law for information entropy would require some conditions, like the closed system in the thermodynamic sense....) Sterile ax murderer 01:46, 14 July 2009 (UTC)

Hell, even if they were identical, the universe may be closed, but as with energy, Earth isn't. After all, I=Ec2, right? Or is it I=m/c2? ħumanUser talk:Human 01:56, 14 July 2009 (UTC)
The main article of that WP section is crying out for expert attention. No expert, I'm willing to let the jury stay out on that, but I'm not willing to say the two have nothing in common. Sprocket J Cogswell (talk) 02:23, 14 July 2009 (UTC)

Information<>message

All the talk here seems to be of information as part of a communication. It all seems a bit "quantum" to me, as if the information does not exist until someone tries to access it. I maintain that there is information in everything: the information in a single atom, or even a subatomic particle, is that which distinguishes it from other atoms or particles. A conglomeration of atoms contains all the information of its individual atoms, plus that which distinguishes it from all other conglomerations, including hints of how it came to be. The fact that this information might never be "read" is immaterial: it's still there. The unpredictable 'quality' of information is what I'm talking about - it enables one piece of spacetime to be differentiated from another. There is no need for there to be an overt message. I suppose an atom or molecule could be the message, and then its properties would be its information, which would affect its interactions with other atoms or molecules, but there is no need for a sender or recipient, which is what seems to be implicit in the article. This message brought to you by: Toastrespondand honey 15:42, 13 July 2009 (UTC)

Shannon's 1948 paper was published by Bell Labs, whose customers were senders and recipients of messages. One of the things that appeals to me about Lakoff's work in linguistics is the notion of embodiment, which I take to mean that our language, and hence to a great extent our ideas, are grounded in our bodily experience. To oversimplify, I look "over" a new chapter because in most cases I've got the book or the screen somewhere around chest level, and am glancing down at it, and so on.
Metaphysically, who cares about information without any possibility of an observer? Does it exist only in the mind of G*d? That is ground I don't care to tread. Some folk might wish that our ideas were the pure stuff of poetry or praise or other ineffable aetheric non-substance, but how can one deny that our thoughts, speculations and fantasies take shape thanks to the effable bodies we inhabit? Sprocket J Cogswell (talk) 23:36, 13 July 2009 (UTC)
Quite frankly, I don't care what Shannon said. Information that is not accessed is still information and may be accessed at some future time. I think that all this message stuff plays into the hands of the creationists by allowing them to posit a sender (and a presumed receiver). Science is the "art" of reading the inherent information in the universe and applying it to read further. No need for any message - it's just there whether anyone reads it or not. This message brought to you by: Toastrespondand honey 00:17, 14 July 2009 (UTC)
"Shannon's 1948 paper was published by Bell Labs, whose customers were senders and recipients of messages" - exactly. He was working on methods of transmitting messages intact, with as economical a system as possible. I suspect he was not working as a "philosopher of information" so much as an engineer. Modern example of getting the job done - how many bits per word and samples per second are required to send and receive intelligible speech? Music? Movies? One interesting result is that the telephone system severely limits the bandwidth of telecons, since only a few octaves are required for legible, recognizable human speech, and bandwidth is at a premium when sending lots of messages over a network. This whole "message" thing is, I suspect, spurious to the idea of "information" being a "quantity" in the universe like matter and energy. ħumanUser talk:Human 01:09, 14 July 2009 (UTC)
Interestingly enough, intelligibility takes less information capacity than speaker recognition. Especially when using a limited specialized vocabulary, human speech can be understood through quite a bit of noise, be it roaring wind or agitated electrons. Decades ago I worked on a VHF cellular network that someone might use to say things like, "This is Harry- we're doing some routine maintenance, so push the big red button now please."
The customer was interested in having enough fidelity (= noise-free bandwidth) that the fellow at the button could hear that it was the voice of Harry making the request, and not some random Gary or Igor. Sprocket J Cogswell (talk) 14:28, 16 July 2009 (UTC)

I agree. Information theory deals with data (or information, whatever) and how it is transferred or compressed. The information could be something apparently "random" in the everyday sense of the word: the static on channel 101 on my TV every night, recorded sequentially, and then passed over a phone line. Even though there is no "message" in my static, can the static be transferred with fidelity? (Natural selection preserves information, in the event anyone wonders what this has to do with biology....) And while there has to be a source and a destination, there does not need to be a human detector. Sterile ax murderer 01:40, 14 July 2009 (UTC)

Toast, is "the universe is its own description" fairly consistent with what you have said?
Description and encoding are central to the concept of information as I understand it. I'm not sure how useful it is to consider information as if it were a thing with any existence apart from its expression at some destination. Sprocket J Cogswell (talk) 02:20, 14 July 2009 (UTC)
Again, you're talking about messages; the fact that they contain information is secondary to you (?). There is information in everything even if it's never accessed. There is information in the message which does not relate to the intended information - there will even be information regarding the degradation - not necessarily overtly there.

I really can't keep this up on this phone - sorry This message brought to you by: Toastrespondand honey 02:55, 14 July 2009 (UTC)

While I think I see what you're getting at, I can't get my hands on it, not sure what to do with it. Might as well try to taste the akashic fluid, count angels dancing on a pinpoint, or fan the luminiferous aether around by waving my hands. Better still, I'd like to keep some phlogiston in a jelly jar for special occasions when there's a pudding to be flamed. In my little world, information is useful when it shows up somewhere as news. For just one example, I believe it is constantly manifesting in every cell of my body, resulting in fresh arrangements of amino acids. Just now there is plenty of information arriving at my eyes, ears, fingertips, and so on. I happen to think that is where I need to focus. Be well, Sprocket J Cogswell (talk) 09:34, 14 July 2009 (UTC)

Did anybody mention yet...

...that IT'S JUST A THEORY !!!!!!11¡¡¡¡¡999–·–·–·–·69·–·–·–∞∞∞ --I'm bored (talk) 21:14, 6 December 2010 (UTC)

For a layman?

I don't understand what information REALLY is... An atom has mass, energy, and momentum, right? So what is its information?

Furthermore, does anyone have a clue about these two books?

http://www.thenatureofinformation.net/index.html

And

http://www.amazon.com/Information-The-New-Language-Science/dp/0674018575

Are they woo, or semiwoo, or science?

If we live in a material universe (energy/mass), what is information? Is it a real "thing" or is it only a description of the possibility of states? Gianga23 (talk) 16:15, 11 June 2014 (UTC)

The website looks like some woo in that it incorporates spirituality into information. The book on Amazon? Probably science. Information, in the sense of information theory, is a bit hard to grasp. I know it from statistics where, essentially, I compare the amount of variation in a statistical model to the amount of variation in raw data (using things like -2LL and other technobabble). If the two match (and they won't), I've captured everything about the data that is possible -- all the information. I've heard a compression analogy used where the information content in an XX-bit image is the smallest size, YY bits, it can be compressed to without degrading the image quality -- essentially, the maximum lossless compression. As for physics and cosmology, I'm a bit less clear. MarmotHead (talk) 16:32, 11 June 2014 (UTC)
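A quick way to see that compression analogy in action, using zlib as a stand-in for "maximum lossless compression" (which it only approximates, so the compressed sizes are rough upper bounds on information content, not exact measures):

```python
# A sketch only: zlib stands in for ideal lossless compression, so these
# compressed sizes are rough upper bounds on information content, not exact.
import os
import zlib

repetitive = b"abc" * 10_000        # 30,000 bytes, highly redundant
random_bytes = os.urandom(30_000)   # 30,000 bytes, essentially incompressible

print(len(zlib.compress(repetitive)))    # a tiny fraction of 30,000
print(len(zlib.compress(random_bytes)))  # roughly 30,000 -- nothing to squeeze out
```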
(EC) See this WP article. The first book is bunk, but the second book seems decent. - Grant (talk) 16:33, 11 June 2014 (UTC)
While you're posting this question on the talk page of an information theory article (not a physical information article), I assume from your questions that you're referring to the tighter physical definition. - Grant (talk) 16:36, 11 June 2014 (UTC)