Jan 5, 2010
I'm always pessimistic about AI.
Our current computation model is deterministic. Do you remember how Dijkstra 'proved' that the goto statement is harmful? He used one (or several) natural numbers to represent the state of a process. We can map several natural numbers to a single rational number, so in fact we can represent any state of any process as a rational number. A Turing Machine program has only two possible outcomes: either it terminates in finite time, like a terminating decimal, or it is trapped in an infinite loop, like a repeating decimal. The lambda calculus is proven to be equivalent to the Turing Machine.
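As a minimal sketch of that encoding step (plain Python; the Cantor pairing function and all names are my own illustration, not Dijkstra's actual construction): two natural numbers can be packed into one, and folding the pairing packs any finite tuple of naturals, such as a machine state, into a single number.

```python
def pair(a, b):
    # Cantor pairing: a bijection from pairs of naturals to naturals.
    return (a + b) * (a + b + 1) // 2 + b

def encode(state):
    # Fold a whole tuple of naturals (a "process state") into one number.
    n = 0
    for part in state:
        n = pair(n, part)
    return n

print(encode((3, 1, 4)))  # 1834: a single number standing for the state
```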
But our brain is non-deterministic. I have a strong feeling that you can't describe the state of a brain by a rational number, only by an irrational number. You can predict the n-th digit of a rational number, while in general you can't predict the n-th digit of an irrational number: you won't know it until your computation reaches that point.
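As a small sketch of the first half of that claim (plain Python, my own example): for a rational p/q, the n-th decimal digit falls out of integer arithmetic directly, with no need to print all the earlier digits first.

```python
def nth_digit(p, q, n):
    # Shift the n-th decimal digit into the units place, then read it off.
    return (p * 10**n // q) % 10

# 1/7 = 0.142857142857...
print([nth_digit(1, 7, n) for n in range(1, 8)])  # [1, 4, 2, 8, 5, 7, 1]
```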
The problem is that the rational numbers are closed under arithmetic: you can't get an irrational number by applying basic arithmetic to rational numbers. If we can't get irrational numbers from rational numbers, can we get a brain from a Turing Machine? Maybe, if we can find which state of a program PI represents.
The foundation of our world is non-deterministic. It seems easy to build something deterministic on top of something non-deterministic, but hard to do the reverse. So I guess it will be hard to build a brain on top of a Turing Machine. Fortunately we have a new hope in our age, named Quantum Computing, which is based on a non-deterministic mechanism. I know little about how it works, but it looks totally different from old-fashioned computation models.
Nov 26, 2009
CS Education
"Programming is usually taught by examples. Experience shows that the success of a programming course critically depends on the choice of these examples. Unfortunately, they are too often selected with the prime intent to demonstrate what a computer can do. Instead, a main criterion for selection should be their suitability to exhibit certain widely applicable techniques. Furthermore, examples of programs are commonly presented as finished "products" followed by explanations of their purpose and their linguistic details. But active programming consists of the design of new programs, rather than contemplation of old programs. As a consequence of these teaching methods, the student obtains the impression that programming consists mainly of mastering a language (with all the peculiarities and intricacies so abundant in modern PL's) and relying on one's intuition to somehow transform ideas into finished programs. Clearly, programming courses should teach methods of design and construction, and the selected examples should be such that a gradual development can be nicely demonstrated. "
- "Program Development by Stepwise Refinement", Niklaus Wirth, 1995
- "Program Development by Stepwise Refinement", Niklaus Wirth, 1995
Nov 24, 2009
Browserless Web Development
note: "Web development" in this article doesn't include UI design/implementation, which means all (backend, database, html etc.) except css/javascript.
I recently came up with a new way to measure code and web developers. A traditional web development loop may look like this:
- read new feature/story
- code
- try it in a browser; if there's any problem, goto 2 (goto considered useful here)
- commit your code
An obvious problem here is that there's no room left for *automated* tests. You may write *automated* tests in step 2, but nothing forces you to. A better process (I think) looks like this:
- read new feature/story
- code
- write a piece of code to test the code from step 2; if there's any problem, goto 2
- commit your code
So we changed step 3 only, removing the browser from our process. *Automated* testing becomes an explicit step here. You could swap steps 2 and 3 in the latter process, in which case you'd write the test first; that's not the point here, so I left the order unchanged. The point is that if you don't have a browser at hand, you'll be forced to test your code by writing code, which is automated, reusable, and cool. You'll find yourself working in TDD style naturally, even if you don't know what TDD is.
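Here is a minimal sketch of what step 3 can look like (plain Python; the `signup` handler and its return convention are hypothetical, not from any real framework):

```python
import unittest

def signup(email):
    # Hypothetical handler under test: validate an email, pretend to register.
    if "@" not in email:
        return 400, "invalid email"
    return 200, "welcome"

class SignupTest(unittest.TestCase):
    def test_valid_email_is_accepted(self):
        status, _ = signup("alice@example.com")
        self.assertEqual(status, 200)

    def test_invalid_email_is_rejected(self):
        status, _ = signup("not-an-email")
        self.assertEqual(status, 400)

if __name__ == "__main__":
    unittest.main()  # step 3 is now running this file, not opening a browser
```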
That's what I call Browserless Web Development. The less a web developer uses a browser to validate his work, the better he and his code are.
Oct 26, 2009
Software Engineering is ...
You know, Dijkstra is really awesome.
"Ours is the task to remember (and to remind) that, in what is now called “software engineering”, not a single sound engineering principle is involved. (On the contrary: its spokesmen take the trouble of arguing the irrelevance of the engineering principles known.) Software Engineering as it is today is just humbug; from an academic —i.e. scientific and educational— point of view it is a sham, a fraud."
"Universities are always faced with something of a problem when there is a marked discrepancy between what society asks for and what society needs. In our case the need is clear: the professional competence of the Mathematical Engineer, familiar with discrete systems design and knowing how to use formal techniques for preventing unmastered complexity from creeping in. But said war “out there” all but prevents this need from being perceived, and what our immediate industrial environment overwhelmingly seems to ask for is different brands of snake oil, Software Engineering, of course, being one of them. And as, with the recession lasting longer and longer, the external pressures on the Universities to do the wrong things only mount, it is only to be expected that part of the campus is going to be included in the battlefield."
"The task of the first-class University, however, is absolutely clear. Industry being its customer, consultancy must tell industry what it wants to hear; it is the task of the first-class University to tell industry what it does not want to hear, whereby it is the rĂ´le of its scientific authority to ensure that the sting of the academic gadfly really hurts."
-- Edsger W. Dijkstra, http://www.cs.utexas.edu/users/EWD/transcriptions/EWD11xx/EWD1165.html
Update:
Interestingly, just after I read Dijkstra's article, Joel Spolsky published a new blog post, "Capstone projects and time management", which stands, to some extent, on the opposite side from Dijkstra. Joel has written a lot of wisdom, but this new post just looks like an April Fools' joke. A smart guy has already written a perfect answer to Joel.
"Ours is the task to remember (and to remind) that, in what is now called “software engineering”, not a single sound engineering principle is involved. (On the contrary: its spokesmen take the trouble of arguing the irrelevance of the engineering principles known.) Software Engineering as it is today is just humbug; from an academic —i.e. scientific and educational— point of view it is a sham, a fraud."
"Universities are always faced with something of a problem when there is a marked discrepancy between what society asks for and what society needs. In our case the need is clear: the professional competence of the Mathematical Engineer, familiar with discrete systems design and knowing how to use formal techniques for preventing unmastered complexity from creeping in. But said war “out there” all but prevents this need from being perceived, and what our immediate industrial environment overwhelmingly seems to ask for is different brands of snake oil, Software Engineering, of course, being one of them. And as, with the recession lasting longer and longer, the external pressures on the Universities to do the wrong things only mount, it is only to be expected that part of the campus is going to be included in the battlefield."
"The task of the first-class University, however, is absolutely clear. Industry being its customer, consultancy must tell industry what it wants to hear; it is the task of the first-class University to tell industry what it does not want to hear, whereby it is the rĂ´le of its scientific authority to ensure that the sting of the academic gadfly really hurts."
-- Edsger W. Dijkstra, http://www.cs.utexas.edu/users/EWD/transcriptions/EWD11xx/EWD1165.html
Update:
It's interesting just after I read Dijkstra's article, Joel Spolsky published a new blog post "Capstone projects and time management", to some extent on the opposite side of Dijkstra. Joel wrote lots with wisdom, but this new post just looks like an april fool's joke. A smart guy wrote a perfect answer to Joel already.
Oct 9, 2009
The Correct Refactor Flow
Read this enlightening piece here.
- Get assigned a task to implement a new feature.
- Refactor the code until that feature is as easy to add as possible (see the sketch after this list).
- Add the feature.
- Submit.
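A tiny illustration of the flow (plain Python; the shop code is hypothetical, my own example):

```python
# Task: support a second shipping carrier (the "new feature").

# Before: the carrier's rate is buried inside the order logic, so the
# feature is awkward to add.
def order_total_before(items):
    return sum(items) + 5.0  # flat rate hard-coded in the middle

# Step 2: refactor until the feature is as easy to add as possible --
# pull shipping out into data the order logic doesn't care about.
SHIPPING_RATES = {"ups": 5.0}

def order_total(items, carrier="ups"):
    return sum(items) + SHIPPING_RATES[carrier]

# Step 3: adding the feature is now a one-line change.
SHIPPING_RATES["fedex"] = 7.5

print(order_total([10.0, 20.0], carrier="fedex"))  # 37.5
```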
Update: I found this actually originates from Martin Fowler's amazing book "Refactoring", which is filled with ideas that grew out of Smalltalk practice. It's a shame I didn't read that book earlier :-( "Refactoring" (or "Refactoring: Ruby Edition") is a must-read.
Sep 10, 2009
Notes on Alan Kay's "The Early History of Smalltalk"
"In computer terms, Smalltalk is a recursion on the notion of computer itself. Instead of dividing "computer stuff" into things each less strong than the whole--like data structures, procedures, and functions which are the usual paraphernalia of programming languages--each Smalltalk object is a recursion on the entire possibilities of the computer. Thus its semantics are a bit like having thousands and thousands of computer all hooked together by a very fast network." (I never think OO in this way before I read this, it changes my view of programming, made so many design pattern look naturally. This is a whole different way to explain why recursion is the root of computer (you know the original way is by lambda calculus))
"Programming languages can be categorized in a number of ways: imperative, applicative, logic-based, problem-oriented, etc. But they all seem to be either an "agglutination of features" or a "crystallization of style." COBOL, PL/1, Ada, etc., belong to the first kind; LISP, APL-- and Smalltalk--are the second kind. It is probably not an accident that the agglutinative languages all seem to have been instigated by committees, and the crystallization languages by a single person." (Very interesting observation. It seems single-person languages are more popular today)
"I could hardly believe how beautiful and wonderful the idea of LISP was. I say it this way because LISP had not only been around enough to get some honest barnacles, but worse, there wee deep falws in its logical foundations. By this, I mean that the pure language was supposed to be based on functions, but its most important components---such as lambda expressions quotes, and conds--where not functions at all, and insted ere called special forms. Landin and others had been able to get quotes and cons in terms of lambda by tricks that were variously clever and useful, but the flaw remained in the jewel. In the practical language things were better. There were not just EXPRs (which evaluated their arguments0, but FEXPRs (which did not). My next questions was, why on earth call it a functional language? Why not just base everuything on FEXPRs and force evaluation on the receiving side when needed?" (I like Alan Kay's criticism because what he wanted LISP to be looks exactly like Haskell :) He used an interesting description for LISP: 'surface beauty'. My opinion is LISP is still great, but not in practical sense now. All gurus suggest learning lisp but only for 'think different', not for using it in daily work. Alan Kay tell an incident later and said: "Watching a famous guy much smarter than I struggle for more than 30 minutes to not quite solve the problem his way (there was a bug) made quite an impression. It brought home to me once again that "point of view is worth 80 IQ points." I wasn't smarter but I had a much better internal thinking tool to amplify my abilities." )
"I didn't like meetings: didn't believe brainstorming could substitute for cool sustained thought."
"The actual beauty of LISP came more from the promise of its metastrcutures than its actual model. I spent a fair amount of time thinking about how objects could be characterized as universal computers without having to have any exceptions in the central metaphor. What seemed to be needed was complete control over what was passed in a message send; in particular when and in what environment did expressions get evaluted? "
"A simple and small system that can do interesing things also needs a "high slope"--that is a good match between the degree of interestingness and the level of complexity needed to express it. "
"The latter was deemed to be hard and would be handled by the usual method for hard problems, namely, give them to grad students. "
"Of course, the whole idea of Smalltalk (and OOP in general) is to define everything intensionally. "
"Perhaps the most important principle--again derived from operating system architectures--is that when you give someone a structure, rarely do you want them to have unlimited priviledges with it. Just doing type-matching isn't even close to what's needed. Nor is it terribly useful to have some objects protected and others not. Make them all first class citizens and protect all. "
"... this led to a 90 degree rotation of the purposed of the user interface from"access to functionality" to "environment in which users learn by doing."" (This is how overlapping windows user interface came out. I think this is also a proof why a programmer should work with keyboard+tilling windown manager: overlapping windows system, and in fact all GUI, is for end users who 'learn by doing'. Programmers are professionals, they should have already learnd their tools before doing anything, they shouldn't learn by doing. I never found a great programmer who mainly use mouse/GUI.)
"By now it was already 1979, and we found ourselves doing one of our many demos, but this time for a very interested audience: Steve Jobs, JeffRaskin, and other technical people from Apple. They had started a project called Lisa but weren't quite sure what it shouldbe like, until Jeff said to Steve, "You should really come over to PARC and see what they ae doing." Thus, more than eight years after overlapping windows had been invented and more than six years after the ALTO started running, the people who could really do something about the ideas, finally to to see them. The machine used was the Dorado, a very fast "big brother" of the ALTO, whose Smalltalk microcode had been largely written by Bruce Horn, one of our original "Smalltalk kids" who was still only a teen-ager. Larry Tesler gave the main part of the demo with Dan sitting in the copilot's chair and Adele and I watched from the rear. One of the best parts of the demo was when Steve Jobs said he didn't like the blt-style scrolling we were using and asked if we cold do it in a smooth continuous style. In less than a minute Dan found the methods involved, made the (relatively major) changes and scrolling was now continuous! This shocked the visitors, espeicially the programmers among them, as they had never seen a really powerful incremental system before. Steve tried to get and/or buy the technology from Xerox (which was one of Apple's minority venture captialists), but Xerox would neither part with it nor would come up with the resources to continue to develop it in house by funding a better NoteTaker cum Smalltalk. " (How stupid Xerox is. In fact Alan Kay has predicted personal computing 30 years ago and reported his vision to Xerox many times. But Xerox just ignored those reports. If Xerox took Alan's work/suggestions seriously, it's very likely to be Microsoft+Apple today, not merely a laser printer company.)
"One way to think about progress in software is that a lot of it has been about finding ways to late-bind"
"Hardware is really just software crystallized early. It is there to make program schemes run as efficiently as possible. But far too often the hardware has been presented as a given and it is up to software designers to make it appear reasonable. This has caused low-level techniques and excessive optimization to hold back progress in program design. ... In short, most hardware designs today are just re-optimizations of moribund architectures. "
"Objects made on different machines and with different languages should be able to talk to each other--and will have-to in the future."
"I think the enormous commercialization of personal computering has smothered much of the kind of work that used to go on in universities and research labs, by sucking the talented kids towards practical applications. With companies so risk-adverse towards doing their own HW, and the HW companies betraying no real understanding of SW, the result has been a great step backwards in most respects. "
Ref. "The Early History of Smalltalk"
"Programming languages can be categorized in a number of ways: imperative, applicative, logic-based, problem-oriented, etc. But they all seem to be either an "agglutination of features" or a "crystallization of style." COBOL, PL/1, Ada, etc., belong to the first kind; LISP, APL-- and Smalltalk--are the second kind. It is probably not an accident that the agglutinative languages all seem to have been instigated by committees, and the crystallization languages by a single person." (Very interesting observation. It seems single-person languages are more popular today)
"I could hardly believe how beautiful and wonderful the idea of LISP was. I say it this way because LISP had not only been around enough to get some honest barnacles, but worse, there wee deep falws in its logical foundations. By this, I mean that the pure language was supposed to be based on functions, but its most important components---such as lambda expressions quotes, and conds--where not functions at all, and insted ere called special forms. Landin and others had been able to get quotes and cons in terms of lambda by tricks that were variously clever and useful, but the flaw remained in the jewel. In the practical language things were better. There were not just EXPRs (which evaluated their arguments0, but FEXPRs (which did not). My next questions was, why on earth call it a functional language? Why not just base everuything on FEXPRs and force evaluation on the receiving side when needed?" (I like Alan Kay's criticism because what he wanted LISP to be looks exactly like Haskell :) He used an interesting description for LISP: 'surface beauty'. My opinion is LISP is still great, but not in practical sense now. All gurus suggest learning lisp but only for 'think different', not for using it in daily work. Alan Kay tell an incident later and said: "Watching a famous guy much smarter than I struggle for more than 30 minutes to not quite solve the problem his way (there was a bug) made quite an impression. It brought home to me once again that "point of view is worth 80 IQ points." I wasn't smarter but I had a much better internal thinking tool to amplify my abilities." )
"I didn't like meetings: didn't believe brainstorming could substitute for cool sustained thought."
"The actual beauty of LISP came more from the promise of its metastrcutures than its actual model. I spent a fair amount of time thinking about how objects could be characterized as universal computers without having to have any exceptions in the central metaphor. What seemed to be needed was complete control over what was passed in a message send; in particular when and in what environment did expressions get evaluted? "
"A simple and small system that can do interesing things also needs a "high slope"--that is a good match between the degree of interestingness and the level of complexity needed to express it. "
"The latter was deemed to be hard and would be handled by the usual method for hard problems, namely, give them to grad students. "
"Of course, the whole idea of Smalltalk (and OOP in general) is to define everything intensionally. "
"Perhaps the most important principle--again derived from operating system architectures--is that when you give someone a structure, rarely do you want them to have unlimited priviledges with it. Just doing type-matching isn't even close to what's needed. Nor is it terribly useful to have some objects protected and others not. Make them all first class citizens and protect all. "
"... this led to a 90 degree rotation of the purposed of the user interface from"access to functionality" to "environment in which users learn by doing."" (This is how overlapping windows user interface came out. I think this is also a proof why a programmer should work with keyboard+tilling windown manager: overlapping windows system, and in fact all GUI, is for end users who 'learn by doing'. Programmers are professionals, they should have already learnd their tools before doing anything, they shouldn't learn by doing. I never found a great programmer who mainly use mouse/GUI.)
"By now it was already 1979, and we found ourselves doing one of our many demos, but this time for a very interested audience: Steve Jobs, JeffRaskin, and other technical people from Apple. They had started a project called Lisa but weren't quite sure what it shouldbe like, until Jeff said to Steve, "You should really come over to PARC and see what they ae doing." Thus, more than eight years after overlapping windows had been invented and more than six years after the ALTO started running, the people who could really do something about the ideas, finally to to see them. The machine used was the Dorado, a very fast "big brother" of the ALTO, whose Smalltalk microcode had been largely written by Bruce Horn, one of our original "Smalltalk kids" who was still only a teen-ager. Larry Tesler gave the main part of the demo with Dan sitting in the copilot's chair and Adele and I watched from the rear. One of the best parts of the demo was when Steve Jobs said he didn't like the blt-style scrolling we were using and asked if we cold do it in a smooth continuous style. In less than a minute Dan found the methods involved, made the (relatively major) changes and scrolling was now continuous! This shocked the visitors, espeicially the programmers among them, as they had never seen a really powerful incremental system before. Steve tried to get and/or buy the technology from Xerox (which was one of Apple's minority venture captialists), but Xerox would neither part with it nor would come up with the resources to continue to develop it in house by funding a better NoteTaker cum Smalltalk. " (How stupid Xerox is. In fact Alan Kay has predicted personal computing 30 years ago and reported his vision to Xerox many times. But Xerox just ignored those reports. If Xerox took Alan's work/suggestions seriously, it's very likely to be Microsoft+Apple today, not merely a laser printer company.)
"One way to think about progress in software is that a lot of it has been about finding ways to late-bind"
"Hardware is really just software crystallized early. It is there to make program schemes run as efficiently as possible. But far too often the hardware has been presented as a given and it is up to software designers to make it appear reasonable. This has caused low-level techniques and excessive optimization to hold back progress in program design. ... In short, most hardware designs today are just re-optimizations of moribund architectures. "
"Objects made on different machines and with different languages should be able to talk to each other--and will have-to in the future."
"I think the enormous commercialization of personal computering has smothered much of the kind of work that used to go on in universities and research labs, by sucking the talented kids towards practical applications. With companies so risk-adverse towards doing their own HW, and the HW companies betraying no real understanding of SW, the result has been a great step backwards in most respects. "
Ref. "The Early History of Smalltalk"
Sep 2, 2009
Cross-VM Attack on EC2
Researchers from UCSD and MIT published a paper showing a vulnerability of cloud computing: the cross-VM attack. With this technique, a malicious user can launch a new EC2 instance on the same physical machine as a target VM instance and exploit information leakage from the target VM.
Read it: http://people.csail.mit.edu/tromer/papers/cloudsec.pdf