Batuhan Osman TASKAYA / cpython

Commit ff8d0873
authored Dec 29, 2015 by Berker Peksag

Issue #25977: Fix typos in Lib/tokenize.py
Patch by John Walker.

parent 3cc8f4b9

Showing 1 changed file with 4 additions and 4 deletions
Lib/tokenize.py

@@ -328,8 +328,8 @@ def untokenize(iterable):
     Round-trip invariant for full input:
         Untokenized source will match input source exactly

-    Round-trip invariant for limited intput:
-        # Output bytes will tokenize the back to the input
+    Round-trip invariant for limited input:
+        # Output bytes will tokenize back to the input
         t1 = [tok[:2] for tok in tokenize(f.readline)]
         newcode = untokenize(t1)
         readline = BytesIO(newcode).readline
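The limited round-trip invariant described in this docstring can be checked directly: feeding `untokenize()` only the `(type, string)` pairs from `tokenize()` yields source bytes that tokenize back to the same pairs. A minimal sketch, using a hypothetical sample source rather than a real file:

```python
from io import BytesIO
from tokenize import tokenize, untokenize

# Hypothetical sample source; any syntactically valid Python bytes work.
source = b"x = 1\nprint(x + 2)\n"

# Limited round trip: keep only the (type, string) pairs, as in the docstring.
t1 = [tok[:2] for tok in tokenize(BytesIO(source).readline)]
newcode = untokenize(t1)

# The regenerated bytes tokenize back to the same (type, string) sequence.
t2 = [tok[:2] for tok in tokenize(BytesIO(newcode).readline)]
assert t1 == t2
```

Note that only the token sequence is guaranteed to survive the limited round trip; the regenerated bytes need not match the original spacing exactly.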
@@ -465,10 +465,10 @@ def open(filename):
 def tokenize(readline):
     """
-    The tokenize() generator requires one argment, readline, which
+    The tokenize() generator requires one argument, readline, which
     must be a callable object which provides the same interface as the
     readline() method of built-in file objects. Each call to the function
-    should return one line of input as bytes.  Alternately, readline
+    should return one line of input as bytes.  Alternatively, readline
     can be a callable function terminating with StopIteration:

         readline = open(myfile, 'rb').__next__  # Example of alternate readline
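The readline interface this docstring describes can be satisfied by any callable returning one line of bytes per call; a `BytesIO` over an in-memory buffer works just as well as a real file. A short sketch (the sample source and the choice of extracting NAME tokens are illustrative):

```python
from io import BytesIO
from tokenize import tokenize, NAME

# readline returns one line of bytes per call, like a binary file's readline().
readline = BytesIO(b"spam = eggs\n").readline

# Collect the identifier tokens from the stream.
names = [tok.string for tok in tokenize(readline) if tok.type == NAME]
print(names)  # -> ['spam', 'eggs']
```

The generator yields `TokenInfo` named tuples, starting with an ENCODING token, which is why `tokenize()` wants bytes rather than str input.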