*Sender*: subbarao2z2@gmail.com

0. If I have n propositions, how many possible clauses are there? (Assume that clauses of the form A V A are automatically simplified to A).

Does this help convince you that the resolution theorem-proving search procedure is *complete*, in that it will terminate whether or not the theorem you are trying to prove actually holds?

1. We saw that propositional logic is monotonic and that the real world requires "defeasible" or "non-monotonic" reasoning. Is probabilistic reasoning monotonic or non-monotonic? Explain.

2. You have obviously heard the two words "probability" and "statistics". What is the difference between them? (Or are they basically synonyms?)

3. We made a big point about the need to represent the joint distribution compactly. Much of elementary probability/statistics handles continuous and multi-valued variables, where specifying the distribution of even a single variable would itself need a huge number of numbers. How is this normally side-stepped in elementary probability?

Rao

0. I'm not sure I understand this one. I think I'm confused by the difference between a proposition and a clause. It's my understanding that a proposition would be made up of clauses, but that it isn't limited to a specific number of clauses. Couldn't a proposition be "there's a 10% chance of X, given A, B and C are true"? Is that one proposition? If so, does it have 4 clauses?

1. Probabilistic reasoning is non-monotonic. The Tweety bird story from class was an example: our belief that a particular statement is true can go up or down as we add new facts, thus non-monotonic.

2. Probability is the "belief" that something will happen: there's a 50% chance that if I cross the street without looking, I'll get hurt. I'm making a prediction about what will happen in the future. Statistics is broader than just probability, though. It includes the analysis of probabilities, but also lets us interpret and analyze data, determine sample sizes, establish cause and effect, etc.

3. Do they side-step this by assuming the variable's values are normally distributed among the possible values, i.e., the bell curve?

Regarding 1

A proposition is basically a boolean variable (say, P, Q etc).

A literal is a proposition or its negation.

A clause is a disjunction of literals.

======

So if I understand correctly, you're asking: if there are 3 propositions, how many total possible clauses, made up of 1, 2, or 3 literals, are there?

After some experimenting, I think the pattern is: the sum from i=1 to n of [(n choose i) * 2^i]. So for n=1, we get 2; for n=2, we get 8 (that's single literals plus two-literal clauses); for n=3, we get 26 (single literals, two-literal clauses, and three-literal clauses). And so on.
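One quick way to check that pattern (a small Python sketch of the sum above):

```python
from math import comb

def nonempty_clauses(n):
    # Choose i of the n propositions (comb(n, i) ways), then give each
    # chosen proposition one of two polarities (2**i ways).
    return sum(comb(n, i) * 2**i for i in range(1, n + 1))

print([nonempty_clauses(n) for n in (1, 2, 3)])  # -> [2, 8, 26]
```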

As this relates to resolution theorem-proving: each resolution step produces a clause from the existing ones, and since there are only a finite number of possible clauses, we must eventually stop generating new ones and terminate?
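That termination argument can be illustrated with a toy saturation loop (a sketch, not the procedure from class; here a clause is a frozenset of (name, polarity) literals, so clauses like A V A collapse automatically):

```python
from itertools import combinations

def resolve(c1, c2):
    """All resolvents of two clauses over complementary literals."""
    out = []
    for name, pol in c1:
        if (name, not pol) in c2:
            out.append((c1 - {(name, pol)}) | (c2 - {(name, not pol)}))
    return out

def saturate(clauses):
    """Add resolvents until nothing new appears. This always terminates,
    because over n propositions there are only finitely many clauses."""
    clauses = set(clauses)
    while True:
        new = {r for a, b in combinations(clauses, 2) for r in resolve(a, b)}
        new -= clauses
        if not new:
            return clauses
        clauses |= new

# Refuting {P, ~P v Q, ~Q}: the empty clause (a contradiction) appears.
kb = {frozenset({("P", True)}),
      frozenset({("P", False), ("Q", True)}),
      frozenset({("Q", False)})}
print(frozenset() in saturate(kb))  # -> True
```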

You are basically right (your i should go from 0 to n; what you then get is 1 (just the empty clause, for n=0), 3, 9, 27, etc.--or in other words, 3^n).

There is an easy way to see that the number of clauses is 3^n: in each clause, each of the n propositions can (1) occur with positive polarity, (2) occur with negative polarity, or (3) not occur at all. So we have 3^n possibilities.
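That three-way choice per proposition can be verified by brute-force enumeration (a quick sketch; the names P1..Pn are made up):

```python
from itertools import product

def all_clauses(n):
    # For each proposition choose: positive literal (True),
    # negative literal (False), or absent (None).
    props = [f"P{i}" for i in range(1, n + 1)]
    return {frozenset((p, s) for p, s in zip(props, choice) if s is not None)
            for choice in product((True, False, None), repeat=n)}

print([len(all_clauses(n)) for n in range(4)])  # -> [1, 3, 9, 27]
```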

And yes, since there are only 3^n clauses, we will eventually derive each of them and terminate.

Rao

ps: By the way, if you feel like exercising your combinatorics, you can also see that the terms

(n \choose i) 2^i can be written as

(n \choose i) 2^i 1^(n-i)

so the summation is just the binomial expansion of

(2+1)^n, i.e., 3^n.
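A quick numeric check of that binomial expansion (a throwaway sketch):

```python
from math import comb

# sum_{i=0}^{n} C(n, i) * 2^i * 1^(n-i) is the expansion of (2+1)^n.
for n in range(8):
    assert sum(comb(n, i) * 2**i for i in range(n + 1)) == 3**n
print("identity checked for n = 0..7")
```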

I believe you are right, Cameron: to sidestep defining many numbers for the joint distribution, we assume the distribution lies along a curve or some function that is determined by fewer numbers. Most of the time it appears to be a "normal" distribution, i.e., a bell curve, where you only need to specify the mean and the standard deviation to get the whole distribution, which reduces it down to only two numbers. Much easier to work with.
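As a concrete illustration of that two-number summary (a hypothetical sketch using Python's standard library; the data and parameter values are made up):

```python
import random
import statistics

# Hypothetical data: 10,000 noisy measurements of a continuous quantity.
random.seed(0)
data = [random.gauss(170.0, 8.0) for _ in range(10_000)]

# Two numbers summarize the whole (assumed-normal) distribution:
mu = statistics.mean(data)
sigma = statistics.stdev(data)

# The fitted model can then answer probability queries directly.
model = statistics.NormalDist(mu, sigma)
print(model.cdf(180))  # P(value <= 180); about 0.89 for these parameters
```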

After we talked about statistics versus probability in class, I think I understand statistics: using data to try to recover the joint distribution. But is probability then using the joint distribution to say how likely something is to occur, whether something falls within a range of values, or just about anything else the joint distribution can tell us?

Alex:

Your understanding is right--we use parameterized distributions to compactly model continuous distributions.

A related question that arises, of course, is: why is the normal distribution used so often? What special properties does it have that make it applicable to modeling such a variety of random phenomena?

Rao