Kirby offered Euclid's Algorithm as an example of a recursive function:

> def gcd(a,b):
>     if b:
>         return gcd(b, a % b)
>     return a
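To make the recursion visible (my own illustration, not Kirby's), here is
how a call unwinds step by step:

```python
def gcd(a, b):
    if b:
        return gcd(b, a % b)
    return a

# The recursion unwinds like so:
#   gcd(12, 8) -> gcd(8, 12 % 8) = gcd(8, 4)
#   gcd(8, 4)  -> gcd(4, 8 % 4)  = gcd(4, 0)
#   gcd(4, 0)  -> b is 0 (falsy), so return 4
print(gcd(12, 8))  # -> 4
```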

This example *is* wondrous for demonstrating recursion, because it's so
elegant and terse. But IMHO, these qualities make it less than ideal for
*teaching* recursion. In particular, the terminal case is implied instead of
defined: "if b == 0, then return a as the GCD of the original numbers".

Here's an implementation that makes more pedagogical sense to me:

def gcd(x, y):
    # terminal case:
    #   smaller goes into larger evenly, or
    #   smaller has been whittled down to 1
    if x % y == 0 or y == 1:
        return y
    # recursive case:
    #   smaller does not go into larger evenly;
    #   GCD must go into the _remainder_ evenly
    else:
        return gcd(y, x % y)

Sure, including the "y == 1" case as a separate test might be overdoing it a
bit (but it does eliminate one recursion).
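For coprime inputs, the explicit terminal test does return one call earlier
than Kirby's version. A quick illustration of my own, instrumenting each
with a hypothetical "calls" counter parameter:

```python
def gcd_kirby(a, b, calls=1):
    # "calls" is a hypothetical extra parameter, used only to count invocations
    if b:
        return gcd_kirby(b, a % b, calls + 1)
    return a, calls

def gcd_john(x, y, calls=1):
    if x % y == 0 or y == 1:
        return y, calls
    return gcd_john(y, x % y, calls + 1)

print(gcd_kirby(3, 2))  # (1, 3): recurses all the way down to gcd(1, 0)
print(gcd_john(3, 2))   # (1, 2): stops as soon as the remainder is 0
```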

Another problem (both with Kirby's implementation and mine) is that you "get
lucky": if the user specifies the smaller number and the larger number in
the "wrong" order, the algorithm magically swaps them for the next
recursion. When tackling other recursion problems, students probably won't
be able to count on such luck.
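One way to take the luck out of it (a sketch of my own, not from either
implementation) is to normalize the argument order up front, so the swap is
something the students can see:

```python
def gcd(x, y):
    # make the swap explicit, rather than relying on the first
    # recursive call to reorder the arguments for us
    if x < y:
        x, y = y, x
    if x % y == 0:
        return y
    return gcd(y, x % y)

print(gcd(8, 12))  # argument order no longer matters -> 4
```

After the swap, y is always x % y from the previous call, so the recursive
calls never need reordering again.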

-John

_______________________________________________

Edu-sig mailing list

[hidden email]
http://mail.python.org/mailman/listinfo/edu-sig