Tut 02 - Functions and Conditionals

Thanks to my tutor Eric Han!

Problem Set 3

Problem 3.1

Problem 3.2

Problem 3.2(a)

One basic idea for making the function call itself is to extract the elements one by one from the list.

Flowchart for the naive recursive solution

However, notice that with this method the time complexity is the same as the non-recursive version. That's because in both methods we extract one element at a time from the list and add it to the result.

So, how can we optimize our algorithm?

Can we use the idea of "half" in this problem?

The answer is yes. We can divide the problem into smaller ones by halving the list we are summing. More generally speaking, we recursively split the final answer into left half sum + right half sum.

Flowchart for the optimized recursion

Also, notice that the last element of our list is L[list - 1], and in this flowchart we assume that sum() is left-closed and right-open. Otherwise, we should make some small changes below:

Changes.c
// Base case
left == right - 1

// Recursion cases
sum(left, (left + right) / 2, list)
sum((left + right) / 2, right, list)

Is this truly "optimized"?

No! In the end, no matter which method we use, we have to visit every element of the list to compute the sum. So the time complexity of all three methods is exactly the same!

Problem 3.2(b)

Same as in Problem 3.2(a), our task now is to calculate i^j, which can be viewed as

\large i^j=\underbrace{i \times i \times \cdots \times i}_{j \text{ times}}

Using this idea, we can form our solution

Flowchart for the naive recursive solution

Now, we should ask ourselves the same question: can we still optimize it? The answer is yes. Using the same idea we introduced in Problem 3.2(a), we can apply "half" to this problem as well.

You can find the solution at power.c.

Is this algorithm truly optimized?

Yes! Recall that in Problem 3.2(a), the idea of "half" did not truly optimize the algorithm. But in this problem, the algorithm is truly optimized.

We can think of this example: suppose we want to calculate 2^{10}.

Using the naive recursion, we need to calculate 2^9, 2^8, ..., 2^2, 2^1. That means we will call pow(i, j) 9 times.

However, using the optimized recursion, we only need to calculate 2^5, 2^2, and 2^1, squaring the result at each step (and multiplying in one extra factor when the exponent is odd). We have cut the number of calls down to about 4.

If the exponent is larger, the optimized version is much faster than the naive recursive solution.

Problem Set 8

Problem 8.1

Problem 8.1(a)

Flowchart for (q)

From this chart, we can directly see that the last condition if (x == y) is redundant. But let's take a step further: the max can only be x or y, which means we only need one condition check. So if (x < y) is actually redundant as well.

Problem 8.1(b)

Flowchart for 8.1(b)

Now, this is the succinct one.

Problem 8.2

Flowchart for 8.2

So, what is the benefit of starting from the middle instead of starting from the first condition?

This method guarantees that we always make exactly 2 checks, while the one that starts from the first condition makes 3 checks in the worst case and 1 check in the best case.

Then, based on our flowchart, we can easily convert it to an if-else statement:

if (score >= 5) /* first check: the middle of the grade range */
{
    if (score >= 8)
    {
        printf("A");
    }
    else
    {
        printf("B");
    }
}
else
{
    if (score >= 3)
    {
        printf("C");
    }
    else
    {
        printf("D");
    }
}
