RE: Seely's Views 2

From: Roger G. Olson <rogero@saintjoe.edu>
Date: Fri Sep 03 2004 - 20:19:41 EDT

>> -----Original Message-----
>> From: Roger G. Olson [mailto:rogero@SAINTJOE.EDU]
>> Sent: Friday, September 03, 2004 4:03 PM
>
>>
>> I kinda see your point, Glenn -- but how could Genesis 1:1 be
>> read in an allegorical way, if in fact Genesis 1-11 is "true"
>> allegory rather than all literal history? Isn't Genesis 1:1
>> compatible with allegory? After all, it doesn't state WHEN
>> "the beginning" was relative to the present.
>> It doesn't say HOW God created the "heavens" (what's that,
>> BTW?) and the earth. How else would God have stated this information?
>>
>> Anyhoo,... can't a text combine aspects of history and
>> allegory? Why does it have to be all or nothing?
>>
>> The problems that the historists have start in Genesis 1:2.
>> That's where the proverbial coprolite hits the alluvial fan.
>
> I think you meant the allegorical fan. :-)
>
> As to a mixture, I am re-reading Shannon's paper which started
> information theory (having just had a debate on the issue). He has a
> part of the paper that is directly applicable to this question of what
> is and isn't to be taken allegorically and what is and isn't to be
> taken as history.
>
> God is supposedly trying to communicate with mankind. That makes God the
> source function in information theory. He uses the channel of a written
> book or a story about the creation of the universe. Now, here is what
> Shannon says:
>
> “If a noisy channel is fed by a source there are two statistical
> processes at work: the source and the noise. Thus there are a number of
> entropies that can be calculated. First there is the entropy H(x) of the
> source or of the input to the channel (these will be equal if the
> transmitter is non-singular). The entropy of the output of the channel,
> i.e., the received signal, will be denoted by H(y). In the noiseless
> case H(y) = H(x). The joint entropy of input and output will be H(x,y).
> Finally there are two conditional entropies Hx(y) and Hy(x), the entropy
> of the output when the input is known and conversely. Among these
> quantities we have the relations
>
> H(x,y) = H(x) + Hx(y) = H(y) + Hy(x).
>
> All of these entropies can be measured on a per-second or per-symbol
> basis.
> “If the channel is noisy it is not in general possible to
> reconstruct the original message or the transmitted signal with
> certainty by any operation on the received signal E.” C. E. Shannon,
> "A Mathematical Theory of Communication," The Bell System Technical
> Journal, 27 (1948): 379-423, pp. 19-20, at
> http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf
>
>
> He then discusses a channel transmitting 1000 bits per second with a
> 1/100 error rate, i.e., a 0 is received as a 1 (or a 1 as a 0) one
> percent of the time.
>
>
> “Evidently the proper correction to apply to the amount of
> information transmitted is the amount of this information which is
> missing in the received signal, or alternatively the uncertainty when we
> have received a signal of what was actually sent. From our previous
> discussion of entropy as a measure of uncertainty it seems reasonable to
> use the conditional entropy of the message, knowing the received signal,
> as a measure of this missing information. This is indeed the proper
> definition, as we shall see later. Following this idea the rate of
> actual transmission, R, would be obtained by subtracting from the rate of
> production (i.e., the entropy of the source) the average rate of
> conditional entropy.
>
> R = H(x) - Hy(x)
>
> “The conditional entropy Hy(x) will, for convenience, be called the
> equivocation. It measures the average ambiguity of the received signal.
> “In the example considered above, if a 0 is received the a
> posteriori probability that a 0 was transmitted is 0.99, and that a 1 was
> transmitted is 0.01. These figures are reversed if a 1 is received. Hence
>
>
> Hy(x) = -[0.99 log 0.99 + 0.01 log 0.01]
> = 0.081 bits/symbol
> or 81 bits per second. We may say that the system is transmitting at a
> rate of 1000 - 81 = 919 bits per second. In the extreme case where a 0 is
> equally likely to be received as a 0 or 1 and similarly for 1, the a
> posteriori probabilities are 1/2, 1/2 and
>
> Hy(x) = -[1/2 log 1/2 + 1/2 log 1/2]
> = 1 bit per symbol
>
> or 1000 bits per second. The rate of transmission is then 0 as it should
> be.”
> C. E. Shannon, "A Mathematical Theory of Communication," The Bell System
> Technical Journal, 27 (1948): 379-423, p. 20, at
> http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf
>
> Now suppose, in the God-communicates-to-humans system, a case where for
> every single part we can't tell whether it is meant to be taken as
> historically true or historically false. Then we are in the last case
> above: the probability for each part is 1/2 true and 1/2 false, so Hy(x)
> equals 1 bit per symbol and the rate of transmission of the divinely
> inspired message is zero.
>
> That is what is wrong with the allegorical and mixture approaches:
> communication becomes impossible, and I can cite Shannon's noisy-channel
> theorem for it! (How is that for applying science to theology?) It is
> clear that Christians of various stripes see various things as true and
> various things as 'allegorical'. Bultmann, I think, felt that even Jesus
> as the Christ was allegorical: Jesus was a man, and the Spirit of God
> abandoned the poor fellow on the cross. Because of this, if each part of
> the story is simply given a 50-50 chance of being right, there is no
> communication at all from God. All is random.
>
> In geophysics we would call this the case where the signal-to-noise
> ratio is 1. I try to record a seismic signal from reflections off of
> rock layers, but if the wind is blowing strongly enough in some places,
> the tree roots act as a seismic noise source. When that noise is as
> great as my seismic signal, the signal-to-noise ratio is 1. Half the
> samples we record in that situation are correct. Half are false. We
> don't know which is which. We can't unscramble truth from fiction.
>
> If we make the Bible have a signal-to-noise ratio of 1, we won't be
> able to separate truth from fiction.

Interesting analysis, Glenn -- muchly appreciated! I'm aware of your
recent engagement with a YEC ("Jezz") on theologyweb.com on the subject of
information theory. That is a very interesting thread.
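
Just to be sure I follow the arithmetic, I redid your equivocation
numbers in a little Python sketch -- my own throwaway code, not anything
from Shannon's paper, and it assumes a uniform source and base-2 logs
(so the equivocation of the symmetric binary channel is just the binary
entropy of the error rate):

    from math import log2

    def h2(p):
        # Binary entropy in bits; handles the p == 0 and p == 1 edge cases.
        if p in (0.0, 1.0):
            return 0.0
        return -(p * log2(p) + (1 - p) * log2(1 - p))

    rate = 1000                  # channel speed in bits per second
    for p_err in (0.01, 0.5):    # Shannon's two cases
        equivocation = h2(p_err)         # Hy(x) in bits per symbol
        R = rate * (1 - equivocation)    # R = H(x) - Hy(x), scaled to bits/s
        print(p_err, round(equivocation, 3), round(R))

It reproduces the 0.081 bits/symbol and 919 bits per second figures, and
the 50-50 case does indeed drive the rate to zero, just as you say.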

I'm wondering what the analog of informational "noise" is in the system of
divine communication to humanity via scripture. Do you mean that Bronze
and Iron Age yuman beans misunderstood direct inspiration from God? Or
can't this "noise" be equated with an accommodation by God to the (very
limited) knowledge of the people to whom He spoke 2.5+ Ka?
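
While I had the interpreter open, I also sanity-checked the chain-rule
identity you quoted, H(x,y) = H(x) + Hx(y) = H(y) + Hy(x), on a toy joint
distribution (the numbers here are made up purely for illustration):

    from math import log2

    # A made-up 2x2 joint distribution P(x,y); rows index x, columns index y.
    P = [[0.4, 0.1],
         [0.2, 0.3]]

    def H(dist):
        # Entropy in bits of a distribution given as a list of probabilities.
        return -sum(p * log2(p) for p in dist if p > 0)

    px = [sum(row) for row in P]            # marginal of the input x
    py = [sum(col) for col in zip(*P)]      # marginal of the output y
    Hxy = H([p for row in P for p in row])  # joint entropy H(x,y)

    # Hx(y): entropy of the output when the input is known.
    Hx_y = sum(px[i] * H([p / px[i] for p in P[i]]) for i in range(len(px)))
    # Hy(x): entropy of the input when the output is known (the equivocation).
    Hy_x = sum(py[j] * H([P[i][j] / py[j] for i in range(len(P))])
               for j in range(len(py)))

    print(Hxy, H(px) + Hx_y, H(py) + Hy_x)  # all three print the same value

All three expressions agree (about 1.846 bits for this toy case), so at
least I'm reading the notation the same way you are.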

Too bad I'm such an ignoramus on this subject; I'd like to contribute more
to this discussion. All I care about is trying, to the best of my ability,
to understand the Truth, which, alas, in toto evades us all this side of
Glory.

In God's Peace,

Roger
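
P.S. Your signal-to-noise analogy begged for a quick simulation, so here
is one more throwaway sketch (again my own code, just illustrating the
50-50 case): flip each transmitted bit with probability 1/2 and the
received stream tells you nothing about what was sent.

    import random

    random.seed(1)  # make the run repeatable
    n = 100000
    sent = [random.randint(0, 1) for _ in range(n)]
    # Flip each bit with probability 1/2: the 50-50 "history or allegory?" case.
    received = [bit ^ (random.random() < 0.5) for bit in sent]

    agree = sum(s == r for s, r in zip(sent, received)) / n
    print(agree)  # hovers around 0.5 -- no better than blind guessing

About half the received bits match the sent ones, which is exactly what
guessing blind would give you: your "can't unscramble truth from fiction"
case.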

-- 
Received on Fri Sep 3 20:48:46 2004
