
Timing Out Urllib2 Urlopen Operation In Python 2.4

I've just inherited some Python code and need to fix a bug as soon as possible. I have very little Python knowledge so please excuse my ignorance. I am using urllib2 to extract data, and I need the urlopen call to time out on Python 2.4.

Solution 1:

You can achieve this using signals.

Here's an example of a signal-based decorator you can use to set a timeout for individual functions.

P.S. I'm not sure this is syntactically correct for 2.4 — I'm using 2.6 — but 2.4 does support signals.

import signal
import time

class TimeOutException(Exception):
    pass

def timeout(seconds):
    def fn(f):
        def wrapped_fn(*args, **kwargs):
            # Arm an alarm that fires after `seconds` seconds.
            signal.signal(signal.SIGALRM, handler)
            signal.alarm(seconds)
            try:
                result = f(*args, **kwargs)
            finally:
                signal.alarm(0)  # cancel the alarm once the call finishes
            return result
        return wrapped_fn
    return fn

def handler(signum, frame):
    raise TimeOutException("Timeout")

@timeout(5)
def my_function_that_takes_long(time_to_sleep):
    time.sleep(time_to_sleep)

if __name__ == '__main__':
    print 'Calling function that takes 2 seconds'
    try:
        my_function_that_takes_long(2)
    except TimeOutException:
        print 'Timed out'
    print 'Calling function that takes 10 seconds'
    try:
        my_function_that_takes_long(10)
    except TimeOutException:
        print 'Timed out'
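For clarity, the same alarm mechanism can also be used inline, without the decorator. The sketch below uses time.sleep as a stand-in for the slow urlopen call; note that SIGALRM is Unix-only and must be armed from the main thread:

```python
import signal
import time

class TimeOutException(Exception):
    pass

def handler(signum, frame):
    raise TimeOutException("Timeout")

# Install the handler and arm a one-second alarm.
signal.signal(signal.SIGALRM, handler)
signal.alarm(1)

try:
    time.sleep(5)      # stands in for the slow urllib2.urlopen call
    timed_out = False
except TimeOutException:
    timed_out = True
finally:
    signal.alarm(0)    # always cancel the pending alarm

print(timed_out)
```

The try/finally ensures the alarm is cancelled whether the call finishes or times out, so a stale SIGALRM can't fire later in the program.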

Solution 2:

It's right there in the function signature — but only from Python 2.6 onward; the timeout parameter does not exist in 2.4:

urllib2.urlopen(url[, data][, timeout])

e.g.:

urllib2.urlopen("http://www.google.com", data, 5)
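On Python 2.4 itself, where urlopen takes no timeout argument, a commonly used workaround is to set a process-wide default socket timeout before making the call; every socket operation, including the ones urllib2 performs internally, then raises socket.timeout once the deadline passes. A minimal sketch (the 5-second value is illustrative):

```python
import socket

# Apply a 5-second deadline to every new socket in this process,
# including the sockets urllib2.urlopen opens internally.
socket.setdefaulttimeout(5)

print(socket.getdefaulttimeout())
```

With this in place, the existing urllib2.urlopen(url) call works unchanged on 2.4; the trade-off is that the timeout is global rather than per-call.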
