Why does (1+1) and 1 return 1 instead of 2?
Why does 1/0 or True evaluate to error instead of True?
Your first result is due to short-circuit evaluation. Here's how Python evaluates the expression:
1. Reduce (1 + 1) to 2.
2. Treat 2 as "truthy" in a boolean context.
3. Evaluate and return the right-hand operand of and.
Note that if the left-hand operand had been "falsy", it would have been returned immediately, and Python would not have attempted to evaluate the right-hand operand.
By the way, or works in a similar way: it returns the left-hand operand if it's "truthy", and the right-hand operand otherwise.
Note that and and or never produce a True or False of their own, and never convert a value to or from bool. One of the two operands is always returned unchanged (unless your program crashes – see below), regardless of whether the operands are of type bool.
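For example (the comments show what each line prints in any Python 3 interpreter):

print((1 + 1) and 1)   # 1 – left side truthy, right operand returned
print(0 and 1)         # 0 – left side falsy, returned immediately
print('a' or 'b')      # a – left side truthy, returned immediately
print('' or 'b')       # b – left side falsy, right operand returned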
Your second result is simply a division by zero error. Like most languages, Python treats division by zero as an error, because the result is undefined in mathematics for real numbers. As soon as Python evaluates 1/0, a ZeroDivisionError is raised and everything stops – Python never even gets the chance to look at the right-hand operand.
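A minimal demonstration – the exception fires while the left-hand operand is being evaluated, before or is ever consulted:

try:
    1/0 or True
except ZeroDivisionError as exc:
    # The crash happens on the left-hand side; `or` never runs.
    print('crashed first:', exc)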
However, because of the short-circuiting behaviour described above, the following expression:
True or 1/0
evaluates to True even though 1/0 would have raised an error, because Python never evaluates the right-hand operand at all in this case.
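You can confirm that the right-hand side is never even touched by hiding a side effect in it (boom here is just an illustrative helper):

def boom():
    print('right-hand operand evaluated!')
    return 1/0

print(True or boom())   # prints just True; boom() is never called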
Hard to argue with the interpreter/bytecode:
In [1]: import dis
In [2]: dis.dis('(1+1) and 1')
  1           0 LOAD_CONST               1 (2)
              2 JUMP_IF_FALSE_OR_POP     6
              4 LOAD_CONST               0 (1)
        >>    6 RETURN_VALUE
First, Python constant-folds 1+1 to 2 at compile time.
Then it says, "If the value is falsy, jump to offset 6 (i.e. return it)". 2 is plainly not falsy, so it loads 1, and that is the result.
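You can run the same experiment on the or variant. The exact opcodes and offsets differ between Python versions (3.11+ replaces JUMP_IF_FALSE_OR_POP with other jump instructions), but the shape is the same – a conditional jump that skips the right-hand operand:

import dis

# Look for the conditional jump that bypasses the 1/0 entirely
# when the left-hand operand is truthy.
dis.dis('True or 1/0')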