Compiler Design and Construction Question Bank Solution 2076

Explore comprehensive solutions to Compiler Design and Construction questions, updated for the year 2076.

Construct SLR parsing table for the following grammar.
S -> aAa | bAb | ba


Explain briefly about different phases involved in compiler, with a block diagram


fig: Block diagram of phases of compiler

The following are the phases of a compiler:

1. Lexical Analysis:

Lexical analysis is the phase in which the source program is read from left to right and grouped into tokens.

eg:

while (i > 0)

Token     Description
while     keyword
(         left parenthesis
i         identifier
>         greater-than symbol
0         integer constant
)         right parenthesis
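
Below is a minimal sketch (not part of the original answer) of how such tokens could be produced by a regular-expression-based scanner in Python; the token names are illustrative assumptions.

import re

# Token specification for the fragment "while (i > 0)"; names are illustrative.
TOKEN_SPEC = [
    ("KEYWORD", r"\bwhile\b"),
    ("LPAREN",  r"\("),
    ("RPAREN",  r"\)"),
    ("RELOP",   r">"),
    ("NUMBER",  r"\d+"),
    ("ID",      r"[A-Za-z_]\w*"),
    ("SKIP",    r"\s+"),
]

def tokenize(source):
    pattern = "|".join(f"(?P<{name}>{regex})" for name, regex in TOKEN_SPEC)
    for m in re.finditer(pattern, source):
        if m.lastgroup != "SKIP":          # whitespace is discarded
            yield (m.lastgroup, m.group())

print(list(tokenize("while (i > 0)")))
# [('KEYWORD', 'while'), ('LPAREN', '('), ('ID', 'i'), ('RELOP', '>'), ('NUMBER', '0'), ('RPAREN', ')')]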

2. Syntax Analysis:

This phase checks whether the arrangement of tokens follows the grammar rules of the programming language and builds the syntactic structure (parse tree) of the given source program.

eg: new_val = old_val + 12


3. Semantic Analysis:

It plays an important role in checking the semantic rules of the expression according to the source language. It is required because the compiler may need to perform additional checks, such as type compatibility, that the grammar alone cannot express.


4. Intermediate Code Generation:

It generates a simple, machine-independent intermediate representation. It should be generated in such a way that it can be easily translated into the target machine code.

eg: A = b + c*d/f

The intermediate code for the above expression is:

T1 = c*d

T2 = T1/f

T3 = b+T2

A = T3

5. Code Optimization:

It is the process of removing unnecessary parts of the code. It decreases the time and space requirements of the program.

eg:

Before code optimization:

b = 0
T1 = a+b
T2 = c*T1
a = T2

After code optimization:

b = 0
a = c*a

6. Code Generation:

It generates the assembly code for the target CPU from an optimized intermediate representation of the program.

eg: A = b + c*d/f

MOV c, R1
MUL d, R1
DIV f, R1
ADD b, R1
MOV R1, A

 

Given a regular expression (ε + 0)*10. Construct the DFA recognizing the pattern described by this regular expression using syntax
tree based reduction.

Given regular expression:

(ε + 0)* 10

1. The augmented R.E. of the given R.E. is:

(ε + 0)* 10#

2. The syntax tree of the above R.E. is:

fig: Syntax tree of (ε + 0)* 10#

3. Compute followpos

followpos(1) = {1,2}
followpos(2) = {3}
followpos(3) = {4}
followpos(4) = Φ

(Here the leaves are numbered 1, 2, 3, 4 for 0, 1, 0 and # respectively. followpos(1) gets {1} from the star node and {2} from the concatenation with 1; the remaining concatenations give followpos(2) = {3} and followpos(3) = {4}; # has no follower.)

4. Create the DFA for the R.E.

Starting state of DFA = firstpos(root) = {1,2} = s0

For s0 = {1,2}:

for 0: followpos(1) = {1,2} = s0
for 1: followpos(2) = {3} = s1

 

For s1 = {3}:

for 0: followpos(3) = {4} = s2
for 1: no position in s1 is labelled 1, so the transition goes to the dead state Φ = s3

For s2 = {4}:

for 0: no position in s2 is labelled 0, so the transition goes to s3
for 1: no position in s2 is labelled 1, so the transition goes to s3

For s3 = Φ:

for 0: s3
for 1: s3

The final (accepting) state is s2, since it contains position 4, which corresponds to #.

fig: DFA for (ε + 0)*10
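
For reference, the transitions computed above can be summarized in the following table (s2 is the accepting state):

State        0     1
s0 = {1,2}   s0    s1
s1 = {3}     s2    s3
s2 = {4}     s3    s3
s3 = Φ       s3    s3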

What is shift reduce parsing techniques? Show shift reduce parsing action for the string (x+x)*a, given the grammar

Shift-reduce parsing is a bottom-up parsing method used in syntax analysis, which is a part of the compiler design process. It uses a stack to hold grammar symbols and an input buffer to hold the input string w.

This technique involves two primary operations, i.e., shift and reduce.

  • Shift: The parser reads the next input symbol and pushes it onto the stack.
  • Reduce: The parser applies a production rule in reverse, replacing a sequence of symbols on the stack with the non-terminal on the left-hand side of the production.

Now,

Given string: (x+x)*a

Given grammar: S → S+S | S*S | (S) | X

The shift-reduce parsing actions are shown in the table below:

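The original table is not reproduced here; the following is a sketch of the shift-reduce actions, assuming that both identifiers x and a are matched by the last production (i.e., it is read as S → x and S → a):

Stack     Input        Action
$         (x+x)*a$     shift (
$(        x+x)*a$      shift x
$(x       +x)*a$       reduce S → x
$(S       +x)*a$       shift +
$(S+      x)*a$        shift x
$(S+x     )*a$         reduce S → x
$(S+S     )*a$         reduce S → S+S
$(S       )*a$         shift )
$(S)      *a$          reduce S → (S)
$S        *a$          shift *
$S*       a$           shift a
$S*a      $            reduce S → a
$S*S      $            reduce S → S*S
$S        $            accept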

Define Syntax directed definition. Construct annotated parse tree for the input expression (5*3+2)*5 according to the following
syntax directed definition.


SDD, i.e., Syntax Directed Definition, is a formal way to specify the syntax and semantics of a programming language. It is a context-free grammar together with attributes and rules.

for eg:

Production      Semantic Rules
E → T           E.val = T.val
T → T1 * F      T.val = T1.val * F.val

The annotated parse tree for the input expression:

(5*3+2)*5


fig: Annotated parse tree for the expression (5*3+2)*5
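
The SDD from the question is not reproduced above, but with the usual desk-calculator rules (each node's val attribute computed bottom-up from its children), the values carried by the annotated tree are: 5 * 3 = 15, then 15 + 2 = 17 inside the parentheses, and finally 17 * 5 = 85 at the root.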

Write Syntax Directed Definition to carry out type checking for the following expression.
E -> id | E1 op E2 | E1 relop E2 | E1[E2] | E1 ↑

given expression

E → id | E1 op E2 | E1 relop E2 | E1[E2] | E1 ↑

1. E → id

  • E.type = lookup(id.entry)

2. E → E1 op E2

  • if E1.type == E2.type and E1.type in {int, float} then E.type = E1.type
  • else E.type = type-error

3. E → E1 relop E2

  • if E1.type == E2.type and E1.type in {int, float} then E.type = bool
  • else E.type = type-error

4. E → E1[E2]

  • if E1.type is an array type and E2.type == int then E.type = E1.type.element_type
  • else E.type = type-error

5. E → E1 ↑

  • if E1.type is a pointer type then E.type = E1.type.pointed_type
  • else E.type = type-error
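
A minimal sketch (not part of the original answer) of these checking rules in Python, using nested tuples as a toy abstract syntax tree; the node layout and the env table are assumptions:

# Return the type of expression e, or "type_error"; env maps identifiers to types.
def check(e, env):
    kind = e[0]
    if kind == "id":                                   # E -> id
        return env.get(e[1], "type_error")
    if kind == "op":                                   # E -> E1 op E2
        t1, t2 = check(e[1], env), check(e[2], env)
        return t1 if t1 == t2 and t1 in ("int", "float") else "type_error"
    if kind == "relop":                                # E -> E1 relop E2
        t1, t2 = check(e[1], env), check(e[2], env)
        return "bool" if t1 == t2 and t1 in ("int", "float") else "type_error"
    if kind == "index":                                # E -> E1[E2]
        t1, t2 = check(e[1], env), check(e[2], env)
        ok = isinstance(t1, tuple) and t1[0] == "array" and t2 == "int"
        return t1[1] if ok else "type_error"           # element type of the array
    if kind == "deref":                                # E -> E1 ^
        t1 = check(e[1], env)
        ok = isinstance(t1, tuple) and t1[0] == "pointer"
        return t1[1] if ok else "type_error"           # pointed-to type
    return "type_error"

# Example: a[i] + x with a : array of int, i : int, x : int  ->  prints "int"
env = {"a": ("array", "int"), "i": "int", "x": "int"}
print(check(("op", ("index", ("id", "a"), ("id", "i")), ("id", "x")), env))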

Explain with example about different methods of intermediate code representation.

Intermediate code representation is an important aspect of compiler design that bridges the gap between high-level source code and low-level machine code. It allows optimizations and transformations that improve the efficiency and performance of the final generated code.

The methods of intermediate code representation are:

  • Graphical representation
  • Postfix notation
  • Three-address code

a. Graphical representation (syntax tree and DAG):

  • Syntax trees are tree representations of the syntactic structure of source code. Each node represents a construct occurring in the source code. E.g., for the expression a = b+c*d the syntax tree is:


fig: syntax tree

  • A DAG (Directed Acyclic Graph) is used to represent expressions and optimize computation by eliminating common sub-expressions. For eg: the DAG for the expression a+a*(b-c)+(b-c)*d is:


fig: DAG
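
For reference (this is not part of the original figure), the sharing of the (b-c) node means the expression can be evaluated with that sub-expression computed only once, for example:

t1 = b - c
t2 = a * t1
t3 = a + t2
t4 = t1 * d
t5 = t3 + t4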

 

b. Postfix notation:

It is a linearized representation of a syntax tree: it is a list of the tree's nodes in which a node appears immediately after its children.

For eg: (a+b)*c

  • Postfix notation = ab+c*

a*(b+c)

  • Postfix notation = abc+*

c. Three-Address Code:

It is a type of intermediate representation that uses a sequence of instructions, each typically involving at most three addresses (two operands and one result).

Format:

x = y op z

where,

x, y, z = variables
op = operator

For eg: expression x + y*z

Three address code representation:

t1 = y*z

t2 = x+t1
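
As a small sketch (not from the original answer), three-address code of this kind can be generated by a post-order walk over the expression's syntax tree; the tuple-shaped tree below is an assumption:

temp_count = 0

def new_temp():
    global temp_count
    temp_count += 1
    return f"t{temp_count}"

# Return the address holding the node's value, appending instructions to code.
def gen(node, code):
    if isinstance(node, str):          # a plain variable name is its own address
        return node
    op, left, right = node
    l = gen(left, code)
    r = gen(right, code)
    t = new_temp()
    code.append(f"{t} = {l} {op} {r}")
    return t

code = []
gen(("+", "x", ("*", "y", "z")), code)   # the expression x + y*z
print("\n".join(code))
# t1 = y * z
# t2 = x + t1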

 

 

What is the purpose of code optimization? Explain different types of loop optimization techniques with example.

It is the process of improving the efficiency of code by making it run faster, consume less memory, or use fewer resources, while preserving the original functionality and output.

The purposes of code optimization are as follows:

  • Optimized code has faster execution speed.
  • Optimized code utilizes memory efficiently.
  • Optimized code gives better performance.

Loop optimization techniques are given below:

a. Code Motion / Frequency Reduction / Loop-invariant code optimization:

It is a technique which moves code outside the loop without affecting the semantics of the program.

for eg:

Before optimization:

while (i < 100)
{
    x = i * sin(A) * cos(A);
}

After optimization:

t = sin(A) * cos(A);
while (i < 100)
{
    x = i * t;
}

b. Loop jamming / Loop fusion:

In this method, several loops are merged into one loop. For eg:

Before optimization:

for i = 1 to n do
    for j = 1 to m do
        a[i,j] = 10

After optimization:

for i = 1 to n*m do
    a[i] = 10

c. Loop unrolling:

In this technique, the number of jumps and tests can be reduced by writing the loop body two times. For eg:

Before optimization:

int i = 1;
while (i <= 100)
{
    a[i] = b[i];
    i++;
}

After optimization:

int i = 1;
while (i <= 100)
{
    a[i] = b[i];
    i++;
    a[i] = b[i];
    i++;
}

 

Discuss the importance of error handler in compiler. How is it manipulated in the different phases of compilation?

Importance of the error handler in a compiler:

  1. Error detection: Identifying mistakes that violate the syntax, semantics, or other constraints of the programming language.
  2. Error reporting: Informing the programmer about the nature and location of errors with clear and helpful messages.
  3. Error recovery: Allowing the compiler to continue processing and analysing the remaining code after encountering an error, instead of terminating prematurely.

Manipulation of the error handler in different phases of compilation:

Lexical Analysis:

  • Error Types: Invalid characters, unterminated strings, incorrect tokens.
  • Error Handling: Skips invalid characters, issues error messages, and continues with the next valid token.

Syntax Analysis:

  • Error Types: Syntax errors like missing semicolons, unbalanced parentheses, incorrect statement structure.
  • Error Handling: Uses panic mode, phrase-level recovery, and error productions to recover and continue parsing.

Semantic Analysis:

  • Error Types: Type mismatches, undeclared variables, invalid type casts, scope errors.
  • Error Handling: Uses symbol tables and type information to check consistency.

Intermediate Code Generation:

  • Error Types: Errors in the intermediate representation, such as invalid operations for given types or incorrect addresses.
  • Error Handling: Ensures that the constraints of the intermediate code are met and reports issues that would affect later compilation stages.

Code Optimization:

  • Error Types: Invalid transformations that alter program semantics.
  • Error Handling: Validates that transformations preserve the original behavior and handles internal errors to ensure correct optimization.

Code Generation:

  • Error Types: Machine-level errors like invalid machine instructions, register allocation issues.
  • Error Handling: Ensures machine code correctness and efficiency, addressing errors propagated from earlier phases.

 

Discuss about the different factors affecting target code generation.

Target code generation is the process of translating an intermediate representation (IR) or high-level language into machine code or a lower-level representation.

Here are the different factors affecting target code generation:

  1. Architecture: Target code generation is influenced by the specific hardware architecture and its features.
  2. Calling convention: Code generation must adhere to rules for parameter passing, return values, and interoperability between modules.
  3. Optimization goals: Depending on priorities like speed, size, or energy efficiency, the compiler applies different optimization techniques during code generation.
  4. Instruction selection: Choosing appropriate machine instructions based on the available instruction set, addressing modes, and optimization goals is crucial.
  5. Data representation: Optimizing memory access patterns, data sizes, and alignment affects code efficiency.
  6. Control flow: Handling control-flow constructs like loops and conditionals efficiently improves code performance.

