Computer Chess Club Archives


Subject: Re: About compiler optimizations

Author: Eugene Nalimov

Date: 09:06:58 12/23/02

I'd recommend that you read something before claiming that it agrees with you. The
web page whose URL you posted contains:

"In the case of the example multiple-pass compiler,

Pass 1: The compiler driver calls the syntactic analyzer (which in turn makes
use of the lexical analyzer) which reads the original source program, parses it,
and constructs an abstract syntax tree (AST).
Pass 2: The compiler driver then calls the semantic analyzer which traverses the
AST, checks it for errors, and annotates it.
Pass 3: The compiler driver then calls the code generator which traverses the
annotated AST and generates the code"

As you can see, the syntax analyzer is called exactly once, in the first pass. All
other passes work on the AST.
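
To make that structure concrete, here is a toy three-pass sketch in C (my own
illustration only, with a made-up one-operator grammar; it is not code from the
FAQ page). Only pass 1 ever touches the source text; the semantic pass and the
code generator both walk the AST:

#include <stdio.h>
#include <stdlib.h>
#include <ctype.h>

typedef struct Node {
    char op;                 /* '+' for addition, 'n' for an integer literal */
    int value;               /* literal value, or the annotated result       */
    struct Node *lhs, *rhs;
} Node;

static const char *src;      /* the source program; read only in pass 1 */

static Node *new_node(char op, int value, Node *lhs, Node *rhs) {
    Node *n = malloc(sizeof *n);
    n->op = op; n->value = value; n->lhs = lhs; n->rhs = rhs;
    return n;
}

/* Pass 1: syntactic analysis -- the only pass that reads the source text. */
static Node *parse_expr(void) {
    while (isspace((unsigned char)*src)) src++;
    int v = 0;
    while (isdigit((unsigned char)*src)) v = v * 10 + (*src++ - '0');
    Node *lit = new_node('n', v, NULL, NULL);
    while (isspace((unsigned char)*src)) src++;
    if (*src == '+') { src++; return new_node('+', 0, lit, parse_expr()); }
    return lit;
}

/* Pass 2: semantic analysis -- traverses and annotates the AST only. */
static int annotate(Node *n) {
    if (n->op == 'n') return n->value;
    n->value = annotate(n->lhs) + annotate(n->rhs);   /* fold constants */
    return n->value;
}

/* Pass 3: code generation -- traverses the annotated AST only. */
static void codegen(const Node *n) {
    printf("  mov eax, %d\n", n->value);   /* emit the folded result */
}

int main(void) {
    src = "1 + 2 + 3";
    Node *ast = parse_expr();   /* pass 1: source text -> AST   */
    annotate(ast);              /* pass 2: AST -> annotated AST */
    codegen(ast);               /* pass 3: prints "mov eax, 6"  */
    return 0;
}

Whether there are two such passes or twenty, the source program is still read
exactly once.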

The definitions of "single-pass" and "multi-pass" compiler have not changed in a
long time. I just checked a 1959 paper I have at home, and it says exactly the same.

I have my own opinion about your and Bob's algorithmic computer chess
knowledge. For example, I thought that Bob taught and helped you over the last
two years, not vice versa. Am I right?

Thanks,
Eugene ("f***ing idiot")

On December 23, 2002 at 09:28:30, Vincent Diepeveen wrote:

>On December 21, 2002 at 22:54:05, Eugene Nalimov wrote:
>
>>Wrong.
>>
>>In all compiler textbooks the number of passes means "how many times the
>>compiler goes through the program code" regardless of the program's
>>representation -- be it source or some intermediate form (quads, tuples,
>>triads, ASTs, etc.).
>>
>>Thanks,
>>Eugene
>
>That's a different notion of passes, one which has more to do with the difficulty
>of optimizing high-level languages.
>
>Note that I just quoted a statement from some researchers in the field
>of compiler optimizations.
>
>Of course that was from a few years ago. Let's be clear there. My knowledge
>of today's compilers is of course very limited, as is Bob's
>algorithmic computer chess knowledge.
>
>>On December 21, 2002 at 21:20:26, Vincent Diepeveen wrote:
>>
>>>On December 21, 2002 at 17:45:43, Matt Taylor wrote:
>>>
>>>>On December 21, 2002 at 17:29:11, Vincent Diepeveen wrote:
>>>>
>>>>>On December 21, 2002 at 14:32:18, Matt Taylor wrote:
>>>>>
>>>>>checkout the compiler faq at :
>>>>>
>>>>>http://www.cs.strath.ac.uk/~hl/classes/52.358/FAQ/passes.html
>>>>>
>>>>>[off topic nonsense removed]
>>>>
>>>>Ok, the FAQ explains principles that were self-evident to me. When you read the
>>>>FAQ, you realize that an optimizing single-pass C compiler is not possible.
>>>>
>>>>"Optimization: Only really possible with a multi-pass compiler"
>>>>
>>>>It also reaffirms what I'd already stated -- multi-pass compilers are EASIER to
>>>>write because the code is more modular and has less coupling. Just about the
>>>>only data structure that you're going to rely on to go between stages is the
>>>>AST, and that's not that difficult.
>>>>
>>>>This is quite familiar to me, as I've been working on a compiler implementation
>>>>for a C-like language. (Actually it's more like C++, but it lacks multiple
>>>>inheritance and templates.)
>>>>
>>>>-Matt
>>>
>>>If you have 'so much' experience with compilers, whereas I consider myself
>>>a layman (I have only written a few very primitive compilers, and none of
>>>them even produces assembly output), I wonder why you do not know what
>>>'single-pass compiler' means. It has to do with how many times a compiler
>>>reads the source code, not so much with how many high-level optimizations
>>>you apply to it.
>>>
>>>So now you have learned something again.
>>>
>>>Best regards,
>>>Vincent.


