1. 05 Nov, 2002 1 commit
  2. 24 Aug, 2002 1 commit
  3. 23 May, 2002 1 commit
  4. 15 May, 2002 1 commit
  5. 01 Apr, 2002 1 commit
  6. 26 Mar, 2002 1 commit
  7. 30 Aug, 2001 1 commit
  8. 08 Aug, 2001 1 commit
  9. 20 Jul, 2001 1 commit
  10. 15 Jul, 2001 1 commit
  11. 29 Jun, 2001 1 commit
    • Turns out Neil didn't intend for *all* of his gen-branch work to get committed. · 4efb6e96
      Tim Peters authored
      
      tokenize.py:  I like these changes, and have tested them extensively
      without even realizing it, so I just updated the docstring and the docs.
      
      tabnanny.py:  Also liked this, but did a little code fiddling.  I should
      really rewrite this to *exploit* generators, but that's near the bottom
      of my effort/benefit scale so doubt I'll get to it anytime soon (it
      would be most useful as a non-trivial example of ideal use of generators;
      but test_generators.py has already grown plenty of food-for-thought
      examples).
      
      inspect.py:  I'm sure Ping intended for this to continue running even
      under 1.5.2, so I reverted this to the last pre-gen-branch version.  The
      "bugfix" I checked in in-between was actually repairing a bug *introduced*
      by the conversion to generators, so it's OK that the reverted version
      doesn't reflect that checkin.
  12. 18 Jun, 2001 1 commit
  13. 23 Mar, 2001 1 commit
  14. 01 Mar, 2001 3 commits
  15. 09 Feb, 2001 1 commit
  16. 15 Jan, 2001 2 commits
  17. 07 Oct, 2000 1 commit
    • Possible fix for Skip's bug 116136 (sre recursion limit hit in tokenize.py). · de49583a
      Tim Peters authored
      tokenize.py has always used naive regexps for matching string literals,
      and that appears to trigger the sre recursion limit on Skip's platform (he
      has very long single-line string literals).  Replaced all of tokenize.py's
      string regexps with the "unrolled" forms used in IDLE, where they're known to
      handle even absurd (multi-megabyte!) string literals without trouble.  See
      Friedl's book for explanation (at heart, the naive regexps create a backtracking
      choice point for each character in the literal, while the unrolled forms create
      none).
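      The naive-versus-unrolled distinction described above can be sketched as follows. This is an illustrative reconstruction, not the actual tokenize.py patterns; the pattern names are mine, and the regexps are simplified to single-quoted literals with backslash escapes:

      ```python
      import re

      # Naive form: one alternation per character, so the engine records a
      # backtracking choice point for every character in the literal.
      naive = re.compile(r"'(?:\\.|[^'\\])*'")

      # "Unrolled loop" form (Friedl): consume runs of ordinary characters
      # with [^'\\]*, treat each escape as a special case, and leave no
      # per-character choice points.
      unrolled = re.compile(r"'[^'\\]*(?:\\.[^'\\]*)*'")

      # Both accept the same language...
      sample = r"'hello \'world\''"
      assert naive.match(sample).group() == unrolled.match(sample).group()

      # ...but the unrolled form copes with very long single-line literals.
      huge = "'" + "x" * 1_000_000 + "'"
      assert unrolled.match(huge) is not None
      ```

      On a modern `re` engine both patterns run iteratively, so the recursion-limit failure from this era can no longer be reproduced directly; the sketch only shows the structural difference the commit message is describing.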
  18. 24 Aug, 2000 1 commit
  19. 17 Aug, 2000 1 commit
  20. 03 Apr, 1998 1 commit
  21. 27 Oct, 1997 2 commits
  22. 03 Jun, 1997 1 commit
  23. 09 Apr, 1997 1 commit
  24. 08 Apr, 1997 1 commit
  25. 10 Mar, 1997 1 commit
  26. 07 Mar, 1997 2 commits
  27. 11 Nov, 1993 1 commit
  28. 16 Mar, 1992 1 commit
  29. 01 Jan, 1992 1 commit