Collaborative articulation of how abstraction and language are employed in the computational manifestation of numbers -- including analysis of the role of syntax, semantics, and meaning in the specification and use of software interfaces.
The nfoCentrale blogs, including Numbering Peano, were published through Blogger via FTP transfer to my web sites. That service is ending.
As part of the migration, I am republishing this blog in the latest stable template format.
Then there will be silence as Blogger is unhooked, although the pages will remain.
No new posts or comments will work until I update the web site to use its own blog engine. Once that migration is completed, posting will resume here, with details about the transition and any breakage that remains to be repaired.
Meanwhile, if you are curious to watch how this works out, check on Spanner Wingnut’s Muddleware Lab. It may be in various stages of disrepair, but that blog will come under new custodianship first.
Labels: blog development
While I think and post mainly about the theoretical abstractions that are important in getting Miser and the Frugal language done properly, I am also thinking ahead to the way the Miser engine can be delivered as a platform component (Windows DLL, .NET assembly, etc.) that hosts Miser on behalf of a software application. I have in mind that any kind of Frugalese user interface for a console application, including any interactive development environment (IDE), would be built atop a hosted oMiser engine as an important first case.
Because the Miser model is platform independent, it would be nice to have a platform-independent way to save, load, and interchange Miser “codes” in a neutral, highly-interchangeable format.
XML comes to mind. Although XML is not a compact binary format, it is a well-known, standard format for which a great number of problems are already solved, including versioning, the ability to deal with extensions gracefully, the complete hiding of the storage structure, and harmony with my intention that the Miser abstraction be universally comprehended. (That is to say, it shall seem as if there is only one oMiser in the world, and the fact that implementations are distributed and replicated is as invisible as can be practically achieved.) In the oMiser case, the XML stream would also be highly compressible using widely-available compression techniques.
The interchangeable “object code” among all oMiser implementations will be standardized in an XML stream.
I propose to abuse the purity of XML for this purpose. The XML will be well-formed but it won’t be the way XML would be naturally used to communicate a persistent format for oMiser Obs.
My basic plan is that the loadable Ob will be coded in a reverse-Polish notation conveyed in XML. The notation will have a syntactic structure that conforms to a BNF grammar such as
allowing the bottom-up build-up of an internal Ob representation as the XML stream is being read. Not every Ob operation needs to be implemented, only those needed for constructing constant Obs:
XML streams for the loadable format will be supported at the core level of an oMiser implementation. Given a handle on an Ob, there is an interface that will emit the XML stream for persistent representation of that Ob for any interchange purpose. Likewise, there is a low-level interface on every oMiser engine by which an XML stream is accepted and a handle for the corresponding Ob returned for use in the hosting application.
The XML representation will be flat. That is, the <oMiser:ob> XML element contains a flat sequence of <term />, <un-op />, and <bin-op /> elements. It is easy to process such a flat structure, and it is easy to ensure that such a structure is well-formed according to the grammar for 〈ob〉 given above.
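As a sketch of how the bottom-up, stack-based loading might work, here is a minimal reader for such a flat stream. The element names <term />, <un-op />, and <bin-op /> are from the description above; the attribute names (`name`, `op`) and the nested-tuple result are my own assumptions for illustration, not the actual oMiser interchange format.

```python
# Hypothetical sketch of bottom-up loading of the flat reverse-Polish
# <oMiser:ob> stream.  Attribute names and the tuple representation are
# assumptions for illustration only.
#
# Grammar being enforced (reverse-Polish form):
#   ob ::= term | ob un-op | ob ob bin-op

import xml.etree.ElementTree as ET

def load_ob(xml_text):
    """Return a nested-tuple stand-in for the constructed Ob."""
    stack = []
    root = ET.fromstring(xml_text)
    for elem in root:
        tag = elem.tag.rsplit('}', 1)[-1]   # strip any namespace prefix
        if tag == 'term':
            stack.append(('term', elem.get('name')))
        elif tag == 'un-op':
            x = stack.pop()
            stack.append((elem.get('op'), x))
        elif tag == 'bin-op':
            y = stack.pop()                 # top of stack is the right operand
            x = stack.pop()
            stack.append((elem.get('op'), x, y))
        else:
            raise ValueError('unexpected element: ' + tag)
    if len(stack) != 1:
        raise ValueError('stream does not reduce to a single ob')
    return stack[0]

# A made-up stream for an Ob of the shape c(NIL, e(NIL)):
stream = """<ob>
  <term name="NIL"/>
  <term name="NIL"/>
  <un-op op="e"/>
  <bin-op op="c"/>
</ob>"""
print(load_ob(stream))   # ('c', ('term', 'NIL'), ('e', ('term', 'NIL')))
```

Note that a conforming stream always leaves exactly one result on the stack, which is the well-formedness condition the flat grammar guarantees.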
There are three interesting challenges for this structure. These need to be explored to ensure that the choice of reverse-Polish Ob-loading is as practical as I intend it to be:
I was reminded of all of this when I recently read the 2009-09-29 Rick Jelliffe post on context-free XML. The shoe-horning of the context-free reverse-Polish flat stream for 〈ob〉 as content for an XML <oMiser:ob> element creates a context-sensitive problem for expression in some sort of XML schema. This post is intended to elaborate on the comment that Rick’s post inspired from me. I trust this is useful context for that. (As usual, your mileage may vary.)
Discussing oMiser implementation considerations is long overdue in any case.
Technorati Tags: Alan Turing, Turing Machine, Turing Test, Bletchley Park, Turing Centenary, computation theory
I just learned that June 23, 1912 is Alan Turing’s birth date.
There will be a centenary celebration throughout 2012 with UK events held in Cambridge, Manchester, and Bletchley Park. There are already many interesting links on the site and among the Turing Centenary Advisory Committee.
Bletchley Park, where the work on cryptanalysis was instrumental in the conduct of World War II, has fallen into disrepair and neglect. One hopes that renewed attention to the life of Alan Turing may bolster support through 2012 and beyond.
[I’m thinking maybe this is an interesting way to finally meet up with Charles Petzold. I might even have more insights to offer about computation, gained from the Miser Project, by then. This year is the time to renew my attention to that.]
Technorati Tags: formalized theories, interpretations of theories, FOL=, first-order logic, equational identities, identification, distinction
[update 2008-06-17T16:08Z: I added t0' because I couldn't stand not to show the better refactoring of t0. I added mention of equivalence classes in an interpretation and also came up with conditions other than invalidity for finding an interpretation unusable.]
In The Logic of Ot, I said that I would use informal expressions of Ot, the logical theory that applies to Miser Obs. Now that there has been some use of the special characters and notations of First-Order Logic with equality, I want to take advantage of that to talk about interpretations of identity in models of Ot. The ability to identify and distinguish has great bearing on computational systems, and identity as an interpretation is particularly useful to explore.
= as Equivalence Relation
With FOL=, identity and the relational operator, "=", are taken as given, and the following hold:
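In standard notation (my transcription of the usual conditions), these are:

```latex
\begin{align*}
(=1)\quad & \forall x\, (x = x) && \text{reflexivity}\\
(=2)\quad & \forall x \forall y\, (x = y \rightarrow y = x) && \text{symmetry}\\
(=3)\quad & \forall x \forall y \forall z\, (x = y \wedge y = z \rightarrow x = z) && \text{transitivity}\\
(=4)\quad & \forall x \forall y\, (x \neq y \leftrightarrow \neg(x = y)) && \text{definition of } \neq
\end{align*}
```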
The first three are the defining properties of an equivalence relation: "=" is reflexive (=1), symmetric (=2), and transitive (=3). The final condition is essentially the definition of "≠" in terms of "=".
What Are We Talking About?
In a first-order logical theory, the variables (x, y, and z as seen in ∀x∀y∀z) are understood to refer to objects in the domain of discourse. We only know what there is to know about that domain from the introduction of constants and the expression of conditions that are theoretically required to be satisfied over that domain. For FOL=, we are given an equivalence relation (expressed with the symbol "="), which tells us very little about the conditions under which variables can be taken as referring to the same object of the domain of discourse.
It should be apparent that having "=" doesn't tell us much about the theoretical objects, although it is more than not having "=" (and its partner, "≠").
An intended interpretation could well be that objects in the domain of discourse are identifiable (we can tell when we are referring to the same one) and discernible (we can tell when we aren't). Let's see how that might work.
A Tiny Domain
To provide some practice with ideas of practical interpretation, consider the logical theory obtained by adding the following conditions:
The intended reading of this is as
which is to say, there are exactly three objects in the domain of discourse.
If we only have that one additional condition (t0) in our logical theory, we know nothing beyond that.
Notice that we have not labeled the three theoretical objects in any way. All we have provided for is that there be exactly three.
It will be useful to appear to be more specific by naming them:
Here, A, B, and C are constants for objects in the domain of discourse. We haven't provided much more than what (t0) assures us of, although if there is more to say about the ways that the three objects differ, having the constants to refer to may be useful.
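One standard way to write conditions with the intended reading above (my reconstruction, not necessarily the post's original formulation) is:

```latex
\begin{align*}
(t0)\quad & \exists x \exists y \exists z\, \bigl(x \neq y \;\wedge\; y \neq z \;\wedge\; x \neq z
            \;\wedge\; \forall w\, (w = x \vee w = y \vee w = z)\bigr)\\
(t1)\quad & A \neq B \;\wedge\; B \neq C \;\wedge\; A \neq C
\end{align*}
```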
The first interpretation will be in terms of numbers. Assume the system of numbers and arithmetic. Take that system as separate from the logical theory consisting of FOL= plus t0 and t1. (t2 is actually a consequence of that much.)
One interpretation of our tiny theory in number theory would be by saying that the interpretation of A is any n < 0, the interpretation of B is 0, and the interpretation of C is any n > 0. We could be specific, say, with interpretation of A as -1, B as 0, and C as +1. We could also say that A is all n < 0, B is 0 only, and C is all n > 0. That is, A, B, and C correspond to distinct classes. Since they have no members in common, these are known as equivalence classes. We'll explore that further when we return to exploration of Ot.
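As a toy illustration of the class-based interpretation (this construction and its names are my own, for illustration), interpret each integer by the class it falls in and take the interpreted "=" to mean "same class":

```python
# Toy illustration (my own construction): interpret A, B, and C as the
# equivalence classes of negative, zero, and positive integers, with the
# interpreted "=" meaning "same class".

def interp(n):
    """Map an integer to the constant whose class it falls in."""
    if n < 0:
        return 'A'
    if n == 0:
        return 'B'
    return 'C'

def eq(m, n):
    """Interpreted '=': membership in the same equivalence class."""
    return interp(m) == interp(n)

# -1 and -7 both fall under A, so they are identified; -1 and +1 are not.
print(eq(-1, -7), eq(-1, 1))   # True False
```

Reflexivity, symmetry, and transitivity of the interpreted "=" follow directly from those properties of equality on the class labels, which is what makes the classes equivalence classes.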
It doesn't matter, here, how the interpretation is chosen, so long as, having made it, we stick to it. The system in which we make the interpretation is a model (in the loose sense of Reality is the Model) provided that all deductions in our logical theory hold in the interpretation.
Because the logical theory says nothing about aspects of the model that are not accounted for in the logical theory, those matters are irrelevant to the conditions of the logical theory. It does not matter how many different ways the interpretation could be made in the model, so long as when one is made, the logical theory is seen to hold for the interpretation. There is a common fallacy involving reasoning about extra-theoretical characteristics of the model to argue that the theory is incorrect or inapplicable, when the disagreement is more-appropriately viewed as one over choice of interpretation. It helps to carefully separate the theory from its interpretations and models to avoid that pitfall. The abstract theory and the logical formalism is helpful in that regard, even if it feels quite unnatural.
"="/"≠" Are Interpreted Too
It is easy to overlook one important feature of an interpretation of our tiny theory: there must be an interpretation for "="/"≠" in the model as well. That interpretation comes along almost automatically with our choice of the system of numbers and arithmetic, which is exactly why it is overlooked. When we dig into computational systems and the details of Miser, the ability to discriminate "="/"≠" in particular interpretations becomes very important.
Finally, as an interpretation in reality: let A be earth, B be wind, and C be fire. This is a questionable interpretation quite apart from the omission of water. The difficulty is assuring that these are cleanly distinguishable concepts. What do we do with flaming molten lava and the sucking wind of a forest fire? We will stumble, at least, in the effort to have a well-determined "="/"≠" and to communicate conditions that others can accept and apply. The simple, practical conclusion may be that the interpretation is invalid (or simply meaningless/useless) and the theory inapplicable in that case.
In other cases, a certain conceptual sloppiness, if carefully circumscribed, may be tolerable in having useful interpretations in reality. It remains to be seen whether that is ever very workable.
Technorati Tags: Charles Petzold, The Annotated Turing, Alan Turing, Andrew Hodges, Keith Devlin, Mathematical Realism
I pre-ordered my copy of Charles Petzold's The Annotated Turing on November 22, 2007. On May 24, I had to authorize a delay in the estimated ship date. I waited patiently. When I saw that Petzold had his copies, I wondered if my order would fall through the cracks. The amazon.com site listed the book as in stock, but I had no word. That was resolved this Monday, June 9, when I received notice of shipment. The book arrived two days later by postal mail.
The problem with actually having The Annotated Turing in my possession is deciding when to start and clearing the time to do it. I did start reading at the end of the book, and I have nosed into a few other sections. Naturally, the book arrived at a moment when all of my projects are behind and I am already starting an important new one. A systematic reading is yet to come. I know I will love it if only for the historical threads and connections that Petzold traces in the book.
As part of the tracing of connections, Petzold has been reading One to Nine by Turing's biographer, Andrew Hodges. There are numerous connections traced there, and I like it that Petzold finds himself arguing with Hodges as he works through the book.
Yesterday, Petzold commented on Hodges' objection to memorization of arithmetic, recognizing his own experience in learning the multiplication tables. The interesting idiosyncrasy is that Petzold lacked automatic recall of certain multiplication combinations and would solve those cases by algebraic deduction when needed. That resonated for me. There are many cases where I did not remember a rule, but I could and did recreate it on demand. I also share Petzold's having done that long after simply memorizing the result would have been more productive. (This shows up in other activities of mine too, including re-inspection of already-written code to remind myself that it is sound and what the context is before adding more to it.)
I wonder how much this ability to abstract an applicable principle (in my case, handling the times-11 and times-12 cases as times-10 plus times-1 or times-2) led to algebraic facility and the handy use of identities and mathematical induction well before I developed anything like a fundamental understanding of number theory over the course of my adult years. I recall re-derivation as being valuable in test-taking, and yet it is not as direct as having embodied the result for immediate availability.
I can't tell you how many times I have verified for myself what the correct formula for the sum of the first n integers is by redoing the constructive derivation. My doubt is always between n(n+1)/2 and n(n-1)/2 and it is, of course [easy for me to say], the former. I say that not because I have memorized it but because I know how to tell quickly another way, the first way I ever saw it "proved." I suspect that I have just sped that up for myself by looking at it anew this time. Then there's the one about the sum of the first n powers of 2 and what it looks like in binary, etc. I suspect that our diminished respect for the teaching of arithmetic and how to verify arithmetic results is causing trouble for students and their teachers when it is time to approach algebra where one can't avoid dealing with ratios and fractions by using a calculator.
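For the record, the pairing derivation that settles the doubt, along with the powers-of-2 observation, runs:

```latex
\begin{align*}
S &= 1 + 2 + \cdots + n\\
2S &= (1 + n) + (2 + (n-1)) + \cdots + (n + 1) = n(n+1)\\
S &= \frac{n(n+1)}{2}\\[1ex]
1 + 2 + 4 + \cdots + 2^{\,n-1} &= 2^{n} - 1 \qquad \text{(in binary: a string of } n \text{ ones)}
\end{align*}
```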
Seeing this latest post from Petzold has me thinking of the connection with a recent paper by Keith Devlin (via Richard Zach), "The Useful and Reliable Illusion of Reality in Mathematics." Two connections come to mind: how we might come to exercise our capacity for abstract, conceptual thinking as we develop our facility with language, and the tendency to see mathematical conceptions as real. In the second case, Petzold has observed that Turing's machine was his idea for a "real" computer, and I am surprised by that. There are deeper connections in the Devlin paper with how we end up regarding mathematical objects, and that is worthy of separate discussion with regard to what makes computers so successful and so devilishly difficult to deal with.
Technorati Tags: physics, formalized theories, interpretations of theories, models of theories, reductionism, empiricism
[cross-posted to Orcmid's Lair. This is at a level of abstract speculation that is more appropriate here than there. However, I would like a broader audience, and reactions, to what strikes me as having practical importance in how we develop successful computer-based systems.]
During my regular Tuesday buddy call with colleague Bill Anderson, it suddenly occurred to me that I could account for reductionism, an error that scientists and others (software technologists and their masters, for example) make. It is all captured in the following statement:
I don't recall what prompted my exclamation on the subject during the call, unless it was something about how objectionable "code is the model," "code rules," and other developer slogans are, where the implementation of something becomes the specification, denying us access to any useful answer to the question "implementation of what?"
Now, it is not a new thing for Bill and me to be discussing these issues, including this view of the role of theory. What hadn't landed so sharply before was how viewing theories as models of reality is the very pitfall that engenders reductionism.
There's much more careful development required as part of arguing for the usefulness of "reality is the model." I have been looking at that in a setting where I am conducting a theory-driven implementation of some software in the Miser Project. Bill and I discuss this even more where it matters a lot to information technology, in contrasting What Computers Know with what Programmers Know to Do and how there is an essential gap between how computer systems are built, the requirements that those systems are meant to satisfy, and the world of opportunity in which those systems are instruments of human purpose.
For now, I want to look at the statement in the context of how we appear to arrive at theories and then apply them as given.
A word of warning: the value of "reality is the model" is not that it is "true." The value to be found is in having a more useful and powerful way of looking at what we do with theories in contrast with the limitations of imagining our theories to be modeling reality.
Where Theories Come From
There seems little doubt that theories started out as explanations of the regularity in our experience of reality, of the world. Some of these theories were, and still are, very pre-scientific (as theories about theorizing might be as well, and that won't stop me).
At some point in the course of the scientific revolution, say around 1600, typified by the work of Francis Bacon, there was an important move to developing scientific theories by inductive generalization from observations of nature, not deduction from assumed principles of cause. Reliance on experimental confirmation and empirical observation became important: a consistent pattern of contradictory results could show where a theory is inapplicable or even completely incorrect. One risk is that the expression of a generalized (abstracted) theory might be taken as an explanation of the nature of nature, as in "objects at rest tend to remain at rest."
Emergence of mathematical sciences, illustrated in the achievements of Isaac Newton, had a profound impact. It permitted the deduction of consequences by calculation or proof, and it permitted the experimental confirmation of those deductions by natural experiments. Notice, however, that the deduction occurs inside the theory, as it were, and the correspondence of the conclusion with reality is an empirical matter.
Theories on Their Own
The mathematical formulation of important theories, and the computational applications of those theories, are removed from reality. Once we are operating in the formal, mathematical theory of a science, there is no reality there. The connection to the reality is accomplished by our interpretation of the mathematical theory as being about reality. Being about something, especially reality, is not a feature of mathematical systems. Being about something is how we interpret results in the mathematical formulation as applying to the world in accord with a scientific thesis. In other words, the scientific thesis part is not expressed in the mathematical formulation. That is what we add ourselves (even though it is what led us to the formulation and why it might be of any value to us). Some of these interpretations have been so reliable and so useful, we tend to speak of the expressions of the theories as laws (E = mc² being a popular one, force being proportional to rate of change of momentum being less familiar, although we experience its confirmation every day).
There is a combination of deductive process (predicting via calculation, say) and inductive formulation in this approach. We might say that (empirical) experience has shown that the interpreted deductions are reliable and that the theory is a good one in that sense.
The pitfall is to think of the theory as the truth, as somehow explaining how it is that interpreted findings in the theory align with reality. Perhaps the most current conceit of this nature is the confusion of what nature does with computation, because computational processes have some similar characteristics.
Interpreting Theories and the Reductionism Pitfall
There is an area of mathematics called "model theory" or what, here, we might call the model-theoretic view of mathematically-expressed theories.
In model theory, the idea is that a mathematical theory, expressed in a formal, logical way, is given an interpretation by identifying its mathematical elements with those in some other system (usually some other kind of mathematical one). If the interpretation is such that deductions in the first theory have results that are true in the interpretation, we say that the interpretation is valid, and that the interpretation is a model of the theory. The model satisfies the theory. I am omitting many technicalities (and probably abusing model theory) in order to appropriate the basic idea for application to interpretations in the world in mathematical sciences.
An important feature of this view is that the theory need not account for everything in the model. For the model to be a model, everything that is true of the theory must be true in the interpretation, but the model is not otherwise constrained. The interpretation is only for that aspect of the model that corresponds to the theory. There may be many other features and aspects to the model that are simply not captured by the interpretation (and hence the theory). Under the particular interpretation, at least, however valid (in either a mathematical or an empirical sense, as the case might be), the theory has nothing to offer about those other matters. In particular, we are free from concluding that the theory explains the model or dictates its "working." We are also free, in the case of the world and many mathematical and logical theories, to have quite different interpretations in reality for models of the same theory. (Interpreting objects and phenomena as numbers is something we are able to do in innumerable ways. I had to say that.)
The reductionist pitfall is treating the theory as the model (and therefore comprehensive), and claiming the theory to be "about" the world. In that case, there is no way to countenance there being anything else about the world and even the obvious becomes inaccessible. There's some other kind of pitfall in faulting a theory for not embracing all of reality in its interpretations, but I am less concerned about that, although "reality is the model" avoids that too.
Computational Manifestations of Theories
This apparently-backward way of looking at theories is about applying theories in ways that are useful in approaching reality. The perspective is also contrary to treating computations as embodiments of theories and seeing them as somehow modeling the world. Theories may have computational models (as interpretations): there is a computational model of the theory, and there may be an intended interpretation in the world that is also a model, but that does not make the computational model a model of the world any more than the theory is.
I find this a very fruitful way to look at a variety of aspects of information technology as it is developed and used. My continuing duty is to articulate this value in less-abstract and directly valuable terms. One curiosity is how this view can still allow for the notion that the act of programming a computer is a case of theory building.
Technorati Tags: Charles Petzold, Alan Turing, Turing Machine, Universal Turing Machine, UTM, Church-Turing Thesis, Computation Theory, On Computable Numbers, Mathematical Platonism, Brouwer, infinities
[update 2008-05-29T16:15Z: Added Petzold's 2008-05-28 post with its nice tie-in of logic and Turing.]
[update 2008-05-27T16:03Z: A little tweaking in my comment on Petzold's 2008-05-25 essay.]
I received an amazon.com e-mail announcing that Charles Petzold's The Annotated Turing is delayed and that I needed to approve the shipping delay from May 23 to June 23. A number of times I have received one of these announcements only to have my order ship immediately afterward. Since Petzold's site reports the book is scheduled to launch on June 16, I am content to wait.
Meanwhile, Petzold has been posting interesting tidbits that attracted his attention while working on the book. In many cases, there will be deeper coverage when the text is available. Here are some that I am particularly keen to dig into further: