Post by DKleinecke Post by Peter T. Daniels Post by Arnaud Fournet
How many times do you need to be repeated that your insane crap has no connection whatsoever with linguistics.
Not a possible "dative passive" -- "How many times does it have to be
repeated to you that ..."
Unlike "Pete was given a warning by Arnaud."
Like many things, PO imagines his proposals to be
linguistic because he wishes to remodel natural language
into his computer-language-like ideal.
See the Tarski Undefinability proof on pages 275-276
AI can never make much progress until Tarski Undefinability is refuted and a mathematical model of ∀x True(x) is completed.
True(x) is the ultimate anchor of all truth conditional semantics.
Truth conditional semantics unifies the formalization of natural language semantics into a single goal.
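For readers unfamiliar with the term: in truth-conditional (Montague-style) semantics, a sentence's meaning is identified with the conditions under which it is true in a model. A minimal illustrative sketch (the model, the names, and the helper functions here are invented for illustration, not anyone's actual formalization):

```python
# A toy model-theoretic fragment: sentence meaning = truth condition
# relative to a model. Purely illustrative of the general idea.

model = {
    "socrates": "s",
    "plato": "p",
    "mortal": {"s", "p"},   # extension of the predicate "mortal"
    "greek": {"s"},         # extension of the predicate "greek"
}

def denotes(name):
    """Map a proper name to the individual it picks out in the model."""
    return model[name]

def true_in_model(subject, predicate):
    """A sentence 'Subject is Predicate' is true in the model iff the
    subject's denotation belongs to the predicate's extension."""
    return denotes(subject) in model[predicate]

print(true_in_model("socrates", "mortal"))  # True
print(true_in_model("plato", "greek"))      # False
```

The point of the formalism is that "true" here is always true *in a model*; Tarski's result, discussed below, concerns whether such a truth predicate can be defined inside the same language it evaluates.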
Since Tarski Undefinability is based on Gödel Incompleteness, refuting either one refutes them both.
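For reference, the standard textbook statement of the result being disputed runs roughly as follows (this is the conventional form, not the poster's formulation):

```latex
% Tarski's undefinability theorem (standard form)
\textbf{Theorem (Tarski, 1933).}
There is no formula $\mathrm{True}(x)$ in the language of arithmetic
such that for every sentence $\varphi$,
\[
  \mathbb{N} \models \mathrm{True}(\ulcorner \varphi \urcorner)
    \leftrightarrow \varphi .
\]
\textit{Proof sketch.} By the diagonal lemma there is a sentence
$\lambda$ with
$\mathbb{N} \models \lambda \leftrightarrow
  \neg \mathrm{True}(\ulcorner \lambda \urcorner)$,
contradicting the schema above.
```

The diagonal lemma used here is the same device that drives Gödel's 1931 proof, which is the standard sense in which the two results are connected.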
A world-class expert coached me on adapting my very simple formal expression
of the 1931 incompleteness theorem so that it would be clearer and more correct.
They added the qualification that F must be at least as expressive as Robinson arithmetic.
See: [Refuting Incompleteness and Undefinability Version(13) (World class expert coaching)]
This may be my final correct refutation of Gödel's 1931 Incompleteness Theorem.
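For comparison, the qualification about Robinson arithmetic mentioned above appears in the standard modern statement of the theorem (again, this is the textbook form, not the poster's Version 13):

```latex
% Gödel's first incompleteness theorem (modern form)
\textbf{Theorem (G\"odel, 1931).}
If $F$ is a consistent, recursively axiomatizable theory that
interprets Robinson arithmetic $Q$, then there is a sentence
$G_F$ such that
\[
  F \nvdash G_F
  \quad\text{and}\quad
  F \nvdash \neg G_F .
\]
```

Robinson arithmetic $Q$ is the usual minimal expressiveness threshold: any weaker theory may simply be too poor to carry out the diagonal construction, in which case incompleteness need not apply.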
Post by DKleinecke
He points to what is called Formal Semantics (which seems
to be the descendant of Montague Grammar) as justifying
his opinions. So far no member of the small (?) Formal
Semantics community has spoken out about whether they
acknowledge what he does as authentic. His grasp of formal
logic is so weak that I doubt that they would.
The formal semantics of linguistics merely has to work consistently and correctly; there is no requirement that it be based on anything besides this.
Until ∀x True(x) is formalized consistently and correctly there is no hope of formalizing natural language consistently and correctly.
Only these guys are going in the right direction.
All of the rest of AI research is focusing on making systems that are great at playing video games and clueless about the meaning of words.
Even IBM's Watson could only make educated guesses and had no deep understanding.