import_that: African rock python (snake)
2014-07-05 06:00 pm
Entry tags:

A tale of yak shaving

This is a couple of years old now, but still interesting: Barry Warsaw, one of the Python core developers, shares the tale of a hairy debugging experience and the yak shaving needed to solve it:

Everyone who reported the problem said the TypeError was getting thrown on the for-statement line. The exception message indicated that Python was getting some object that it was trying to convert to an integer, but was failing. How could you possibly get that exception when either making a copy of a list or iterating over that copy? Was the list corrupted? Was it not actually a list but some list-like object that was somehow returning non-integers for its min and max indexes?
import_that: XKCD guy flying with Python (Default)
2014-05-02 12:51 pm
Entry tags:

Changing Python's prompt

I have a mild dislike of Python's default prompt, ">>>". Not that the prompt itself is bad, but when you copy from an interactive session and paste into email or a Usenet post, the prompt clashes with the standard > email quote marker. So I've changed my first level prompt to "py>" to distinguish interactive sessions from email quotes. Doing so is very simple:

import sys
sys.ps1 = 'py> '

Note the space at the end.

You can change the second level prompt (by default, "...") by assigning to sys.ps2, but I haven't bothered. Both prompts support arbitrary objects, not just strings, which you can use to implement dynamic prompts similar to those IPython uses. Here's a simple example of numbering the prompts:

class Prompt:
    def __init__(self):
        self.count = 0
    def __str__(self):
        self.count += 1
        return "[%4d] " % self.count

sys.ps1 = Prompt()

If you're trying to use coloured prompts, there are some subtleties to be aware of. You have to escape any non-printing characters. See this bug report for details.

You can have Python automatically use your custom prompt by setting it in your startup file. If the environment variable PYTHONSTARTUP is set, Python will run the file it names when you start the interactive interpreter. As I am using Linux for my desktop, I have the following line in my .bashrc file to set the environment variable each time I log in:

export PYTHONSTARTUP=/home/steve/python/

and the startup file itself then sets the prompt, as shown above.
import_that: XKCD guy flying with Python (Default)
2014-04-28 11:20 am
Entry tags:

try-finally oddity

There's a curious oddity with Python's treatment of return inside try...finally blocks:

py> def test():
...     try:
...         return 23
...     finally:
...         return 42
py> test()
42

While it seems a little odd, I don't think we can really call it a "gotcha", as it shouldn't be all that surprising. The finally block is guaranteed to run when the try block is left, however it is left. The first return sets the return value to 23, the second resets it to 42.

It should be no surprise that finally can raise an exception:

py> def test():
...     try: return 23
...     finally: raise ValueError
py> test()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 3, in test
ValueError

A little less obvious is that it can also swallow exceptions:

py> def test():
...     try: raise ValueError
...     finally: return 42
py> test()
42

Earlier, I wrote that the finally block is guaranteed to run. That's not quite true. The finally block won't run if control never leaves the try block:

try:
    while True:
        pass
finally:
    print('this never gets reached')

It also won't run if Python is killed by the operating system, e.g. in response to kill -9, or following loss of power and other such catastrophic failures.

There are also two official ways to force Python to exit without running cleanup code, including code in finally blocks: os.abort and os._exit. Both should be used only for specialized purposes. Normally you should exit your Python programs by:

  1. Falling out the bottom of the program. When there is no more code to run, Python exits cleanly.

  2. Calling sys.exit.

  3. Raising SystemExit.
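The last two options are really the same mechanism: sys.exit works by raising SystemExit, which means you can catch it like any other exception. A minimal sketch:

```python
import sys

try:
    sys.exit(3)  # sys.exit simply raises SystemExit
except SystemExit as e:
    # the exit status is available on the exception object
    print(e.code)  # 3
```

If the exception propagates out of the program uncaught, the interpreter exits cleanly with that status, running any pending finally blocks along the way.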
import_that: XKCD guy flying with Python (Default)
2014-04-23 02:20 am
Entry tags:

Classic and new classes

One of the historical oddities of Python is that there are two different kinds of classes, so-called "classic" or "old-style" classes, and "new-style" classes or types. To understand the reason, we have to go back to the dawn of time and very early Python.

Back in the early Python 1.x days, built-in types like int, str and list were completely separate from classes you created with the class keyword. Built-in types were written in C, classes were written in Python, and the two could not be combined. So in Python 1.5, you couldn't inherit from built-in types:

>>> class MyInt(int):
...     pass
Traceback (innermost last):
  File "<stdin>", line 1, in ?
TypeError: base is not a class object

If you wanted to inherit behaviour from a built-in type, the only way to do so was by using delegation, a powerful and useful technique that is unfortunately underused today. At the time though, it was the only way to solve the problem of subclassing from built-in types, leading to useful recipes like this one from Alex Martelli.

But in Python 2.2, classes were unified with built-in types. This involved a surprising number of changes to the behaviour of classes, so for backwards-compatibility, both the old and the new behaviour was kept. Rather than introduce a new keyword, Python 2.2 used a simple set of rules:

  • If you define a class that doesn't inherit from anything, it is a classic (old-style) class.

  • If your class inherits from another classic class, it too is a classic class.

  • But if you inherit from a built-in type (e.g. dict or list) then it is a new-style class and the new rules apply.

  • Since Python supports multiple inheritance, you might inherit from both new-style and old-style classes; in that case, your class will be new-style.

To aid with this, a new built-in type was created, object. object is the parent type of all new-style classes and types, but not old-style classes. Curiously, isinstance(classic_instance, object) still returns True, even though classic_instance doesn't actually inherit from object. At least, that is the case in Python 2.4 and above; I haven't tested 2.2 or 2.3.

There are a few technical differences in behaviour between the old- and new-style classes, including:

  • super only works in new-style classes.

  • Likewise for classmethod and staticmethod.

  • Properties appear to work in classic classes, but actually don't. They seem to work so long as you only read from them, but if you assign to a property, the setter method is not called and the property is replaced.

  • In general, descriptors of any sort only work with new-style classes.

  • New-style classes support __slots__ as a memory optimization.

  • The rules for resolving inheritance of old- and new-style classes are slightly different, with old-style classes' method resolution order being inconsistent under certain circumstances.

  • New-style classes speed up operator overloading by looking up dunder methods like __add__ on the class, ignoring any that are defined on the instance.
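That last difference is easy to demonstrate in modern Python, where every class is new-style. Here's a minimal sketch with a hypothetical class A:

```python
class A:
    def __add__(self, other):
        return "class add"

a = A()
# The + operator looks up __add__ on the type, not the instance,
# so this instance attribute is silently ignored:
a.__add__ = lambda other: "instance add"
print(a + a)  # class add
```

On an old-style class, the instance attribute would have won.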

The complexity of having two kinds of class was always intended to be a temporary measure, and in Python 3, classic classes were finally removed for good. Now, all classes inherit from object, even if you just write class MyClass: with no explicit bases.
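The unification is easy to verify in Python 3:

```python
class MyClass:
    pass  # no explicit bases

# object is an implicit base of every Python 3 class
print(issubclass(MyClass, object))  # True
print(object in MyClass.__mro__)    # True
```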
import_that: XKCD guy flying with Python (Default)
2014-04-21 02:29 pm
Entry tags:

More on language popularity

Recently, I wrote about the various ways of measuring language popularity, and I thought I'd add another two.

TrendySkills measures popularity of IT technologies (not just languages) by extracting information from job advertisements. It's currently showing Python at number 9, just ahead of C and just behind HTML5. Despite the name, the site doesn't appear to measure trends as such (what technologies are becoming more popular or less popular), but only snapshots of current popularity. It also mixes data collected from numerous countries, including the USA, Spain and Sweden. I don't think the job market is truly world-wide, not even in IT, so that seems a weakness to me: just because a technology is popular in one country doesn't mean it will be equally popular in another.

RedMonk periodically posts a graph of language popularity based on GitHub and StackOverflow. They find Python in position 5, sandwiched between C# and C++.

However you measure it, there's no doubt that Python is one of the most popular and influential languages around.
import_that: 2.8 inside a red circle with a line crossing it out (no-python2.8)
2014-04-15 01:05 pm
Entry tags:

Another 2.8 proposal

There's yet another call for Python 2.8, or at least for discussion of 2.8. Martijn Faassen discusses what Python 2.8 should be, even though it won't be. Frankly, I'm not entirely sure what motivates Martijn to write this post, since he seems to have accepted that there won't be a Python 2.8 and isn't asking for that decision to be rethought. But let's treat this as a serious suggestion.

Martijn suggests that, paradoxically, the best way to handle the Python 2.x to 3.x transition is for 2.8 to break backwards compatibility with 2.7. I thought we had that version. Isn't it called Python 3.x? Not according to Martijn, who wants to add — or rather since he explicitly says he won't be doing the work, he wants somebody else to add — a whole series of extra compiler options in the form of from __future3__ and from __past__ imports. Like __future__, they will presumably behave like compiler directives and change the behaviour of Python.
import_that: Bar and line graph (graph)
2014-04-09 11:37 pm
Entry tags:

More on variance

Earlier, I discussed some of the terminology and different uses for the two variance functions in Python 3.4's new statistics module. That was partly motivated by a question from Andrew Szeto, who asked me why the variance functions took optional mu or xbar parameters.

The two standard deviation functions stdev and pstdev are just thin wrappers that return the square root of the variance, so they have the same signature as the variance functions.

I had a few motives for including the second parameter, in no particular order:

  1. The reason given in the PEP was that I took the idea from the GNU Scientific Library. Perhaps they know something I don't? (I actually thought of the idea independently, but when I was writing PEP 450 I expected this to be controversial, and was pleased to find prior art.)

  2. It allows a neat micro-optimization to avoid having to recalculate the mean if you've already calculated it. If you have a large data set, or one with custom numeric types where the __add__ method is expensive, calculating the mean once instead of twice may save some time.

  3. Mathematically, the variance is in some sense a function dependent on μ (mu) or x̄ (xbar). Making them parameters of the Python functions reflects that sense.

  4. Variance has a nice interpretation in physics: it's the moment of inertia around the centre of mass. We can calculate the moment of inertia around any point, not just the centre — might we not also calculate the "variance" around some point other than the mean? If you want to abuse the variance function by passing (say) the median instead of the mean as xbar or mu, you can. But if you do, you're responsible for ensuring that the result is physically meaningful. Don't come complaining to me if you get a negative variance or some other wacky value.

  5. It also allows you to calculate an improved sample variance by passing the known population mean to the pvariance function — see my earlier post or Wikipedia for details.
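The micro-optimization in point 2 looks like this in practice (a sketch with made-up data):

```python
import math
import statistics

data = [2.75, 1.75, 1.25, 0.25, 0.5, 1.25, 3.5]

xbar = statistics.mean(data)  # compute the mean once...
# ...then pass it in, so variance doesn't have to recompute it
v = statistics.variance(data, xbar)

print(math.isclose(v, statistics.variance(data)))  # True
```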

Individually, none of these reasons was especially strong, and probably wouldn't have justified the additional complexity on its own. But taken together I think they justified including the optional parameters.

Surprisingly (at least to me), I don't recall much if any opposition to these mu/xbar parameters. I expected this feature would be a lot more controversial than it turned out to be. If I recall correctly, there was more bike-shedding about what to call the parameters (I initially just called them "m", for mu/mean) than whether or not to include them.
import_that: Bar and line graph (graph)
2014-03-20 03:02 pm
Entry tags:

Population and sample variance

I am the author of PEP 450 and the statistics module. That module offers four different functions related to statistical variance, and some people may not quite understand the differences between them.

[Disclaimer: statistical variance is complicated, and my discussion here is quite simplified. In particular, most of what I say only applies to "reasonable" data sets which aren't too skewed or unusual, and samples which are random and representative. If your sample data is not representative of the population from which it is drawn, then all bets are off.]

The statistics module offers two variance functions, pvariance and variance, and two corresponding versions of the standard deviation, pstdev and stdev. The standard deviation functions are just thin wrappers which take the square root of the appropriate variance function, so there's not a lot to say about them. Except where noted differently, everything I say about the (p)variance functions also applies to the (p)stdev functions, so for brevity I will only talk about variance.

The two versions of variance give obviously different results:

py> import statistics
py> data = [1, 2, 3, 3, 3, 5, 8]
py> statistics.pvariance(data)
4.530612244897959
py> statistics.variance(data)
5.285714285714286

So which should you use? In a nutshell, two simple rules apply:

  • If you are dealing with the entire population, use pvariance.

  • If you are working with a sample, use variance instead.

If you remember those two rules, you won't go badly wrong. Or at least, no more badly than most naive users of statistical functions. You want to be better than them, don't you? Then read on...
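The difference between the two functions comes down to the divisor: pvariance divides the sum of squared deviations by n, while variance divides by n - 1 (Bessel's correction). A quick sketch of that relationship:

```python
import math
import statistics

data = [1, 2, 3, 3, 3, 5, 8]
n = len(data)

# the sample variance equals the population variance scaled by n/(n-1)
print(math.isclose(statistics.variance(data),
                   statistics.pvariance(data) * n / (n - 1)))  # True
```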

import_that: XKCD guy flying with Python (Default)
2014-03-19 12:30 pm
Entry tags:


Nathaniel Smith describes the culture of the python-ideas and python-dev mailing lists:

We're more of the love, bikeshedding, and rhetoric school. Well, we can do you bikeshedding and love without the rhetoric, and we can do you bikeshedding and rhetoric without the love, and we can do you all three concurrent or consecutive. But we can't give you love and rhetoric without the bikeshedding. Bikeshedding is compulsory.

Bike-shedding gets a bad rap, and deservedly so. But bike-shedding can also be a sign of passion and attention to detail. Sometimes it really does matter what colour the bike-shed is: "colour" (syntax) can have functional and practical consequences, even for real-life bike-sheds. Dark colours tend to absorb and retain more heat than light colours. Syntax matters. Languages which feel like a harmonious whole — even if the design sometimes only makes sense if you are Dutch — require that the designers care about the little details of syntax and spelling. Even though functionally there would be little difference, Python would be a very different language indeed if we spelled this:

def func(x, y):
    return x/(x+y)

as this:

: func OVER + / ;

It would in fact be Forth.

So let's hear it for a little bit of bike-shedding — but not too much!
import_that: XKCD guy flying with Python (Default)
2014-03-05 11:48 am
Entry tags:

Language popularity

What's the most popular programming language in the world? Or at least in those parts of the English-speaking world which are easily found on the Internet? C, C++, Java, PHP, Javascript, VB? Is Python on the rise or in decline? How do we know?

There are a few websites which make the attempt to measure programming language popularity, for some definition of "popularity". Since they all have different methods of measuring popularity, and choose different proxies to measure (things like the number of job ads or the number of on-line tutorials for a language), they give different results — sometimes quite radically different, which is a strong indicator that even if language popularity has a single objective definition (and it probably doesn't) none of these methods are measuring it.

So keeping in mind that any discussion of language popularity should be taken with a considerable pinch of salt, let's have a look at four well-known sites that try to measure popularity.

import_that: XKCD guy flying with Python (Default)
2014-02-07 07:43 pm
Entry tags:

Does Python pass by reference or value?

One topic which comes up from time to time, usually generating a lot of heat and not much light, is the question of how Python passes values to functions and methods. Usually the question is posed as "Does Python use pass-by-reference or pass-by-value?".

The answer is, neither. Python, like most modern object-oriented languages, uses an argument passing strategy first named by one of the pioneers of object-oriented programming, Barbara Liskov, in 1974 for the language CLU. Liskov named it pass-by-object-sharing, or just pass-by-sharing, but the actual strategy is a lot older, being the same as how Lisp passes arguments. Despite this august background, most people outside of Python circles have never heard of pass-by-sharing, and consequently there is a lot of confusion about argument passing terminology.

Let's start by looking at how people get confused. First we'll "prove" that Python is pass-by-value (also known as call-by-value), then we'll "prove" that Python is pass-by-reference (call-by-reference). It's actually neither, but if you think that there are only two ways to pass arguments to a function, you might be fooled into thinking Python uses both.
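A sketch of how those two rival "proofs" typically go, using hypothetical functions rebind and mutate: rebinding a parameter doesn't affect the caller (which looks like pass-by-value), while mutating a shared object does (which looks like pass-by-reference).

```python
def rebind(x):
    # rebinds the local name only; the caller's binding is untouched
    x = 2

def mutate(items):
    # mutates the object that both names refer to
    items.append(2)

a = 1
rebind(a)
print(a)  # 1 -- "evidence" for pass-by-value

b = [1]
mutate(b)
print(b)  # [1, 2] -- "evidence" for pass-by-reference
```

Pass-by-sharing explains both results at once: the function receives a reference to the same object, bound to a fresh local name.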
import_that: XKCD guy flying with Python (Default)
2014-01-15 11:11 pm
Entry tags:

Lies in code

Quote of the week:

At Resolver we've found it useful to short-circuit any doubt and just refer to comments in code as 'lies'.

-- Michael Foord paraphrases Christian Muirhead on python-dev.
import_that: XKCD guy flying with Python (Default)
2014-01-15 08:06 pm
Entry tags:

When to use assert

Python's assert statement is a very useful feature that unfortunately often gets misused. assert takes an expression and an optional error message, evaluates the expression, and if it gives a true value, does nothing. If the expression evaluates to a false value, it raises an AssertionError exception with optional error message. For example:

py> x = 23
py> assert x > 0, "x is zero or negative"
py> assert x%2 == 0, "x is an odd number"
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AssertionError: x is an odd number

Many people use assertions as a quick and easy way to raise an exception if an argument is given the wrong value. But this is wrong, badly wrong, for two reasons.
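One well-known reason, sketched here with a hypothetical validation function: assertions are compiled away entirely when Python runs with the -O flag, so an assert is not a reliable way to validate arguments. An explicit exception is.

```python
def fragile_sqrt(x):
    # misuse: under "python -O" this check silently vanishes
    assert x >= 0, "x must be non-negative"
    return x ** 0.5

def robust_sqrt(x):
    # an explicit check survives optimization flags
    if x < 0:
        raise ValueError("x must be non-negative")
    return x ** 0.5

print(robust_sqrt(9.0))  # 3.0
```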