In Which We Learn to Code
Feb. 13th, 2021 10:54 pm

Ever have those days where you are absolutely certain you have somehow messed up the process despite getting something that produces the answer the book asks for?
I am learning about lists and loops in my Python book. Logically, this specific exercise is clearly an extension of the whole "x = x + 1" conceptual problem, which I had to get Mathfriend to explain to me in very small words, but which I have a good handle on now.
You are given a list: xs = [12, 10, 32, 3, 66, 17, 42, 99, 20]
The assignment is to find the product of the list using a loop.
This works:
total = int(1)
for xs in [12, 10, 32, 3, 66, 17, 42, 99, 20]:
    total = int(total * xs)
print(total)
It produces the desired result. If you omit setting total to 1 at the beginning, it complains about total being undefined farther down, which I get: total depends on itself, so it needs to start at something. And setting it to start at 1 doesn't mess with the end result. (The previous exercise was addition, and it started at zero.)
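To keep it straight in my own head, here is a minimal side-by-side sketch of both exercises. The names total_sum, total_product, and the loop variable x are just my own picks, and I'm looping over the named list instead of retyping it:

xs = [12, 10, 32, 3, 66, 17, 42, 99, 20]

# addition: start at 0, because adding 0 changes nothing
total_sum = 0
for x in xs:
    total_sum = total_sum + x
print(total_sum)

# multiplication: start at 1, because multiplying by 1 changes nothing
total_product = 1
for x in xs:
    total_product = total_product * x
print(total_product)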
I cannot shake the feeling I am getting some part of this wrong in some way, possibly by taking the wrong approach entirely, but I can't figure out another possible one using only the terms the book has described so far. Especially since the addition exercise did explicitly say "set it to zero to start." I just feel like a more elegant way to do it should exist.
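For my own future reference: there do seem to be shorter ways to do this that the book hasn't introduced yet, so take this as a sketch rather than the intended answer (math.prod needs Python 3.8 or later; functools.reduce is the same accumulator idea with the loop hidden inside):

import math
from functools import reduce
import operator

xs = [12, 10, 32, 3, 66, 17, 42, 99, 20]

# math.prod multiplies everything in the list for you
print(math.prod(xs))

# reduce folds the list with a starting value of 1, just like the hand-written loop
print(reduce(operator.mul, xs, 1))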
(Also welcome to the posts where I complain about my coding lessons. Particularly in self-teaching I find it easier to actually sit down to do things if I'm writing up a Dreamwidth post about them, so you'll be getting some chronicling of my Adventures in Code coming up.)
no subject
Date: 2021-02-23 06:33 pm (UTC)

Oh, good for them! That's the right computer-science way to do it, and modern functional-programming courses often start that way (I'm currently facilitating a Scala course for a diverse group at work that does that), but it's still uncommon, yes.