I really think Processing’s loadStrings() and saveStrings() are super-great, and I use them all the time when I need to read text from a file or write text to one.
But I’m writing some instructional materials and I wanted to give my readers a taste of Python’s common with open(file ...) as f: idiom, in case they want to use other Python interpreters somewhere else.
For reading Unicode strings from a file, io.open() seemed to work better than the ‘normal’ open() (non-ASCII characters came out wrongly decoded otherwise):
from io import open as io_open

with io_open("data/my_unicode_strings.txt", 'r') as f:
    line_list = f.readlines()
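For what it’s worth, passing the encoding explicitly seems to make the read more predictable than relying on the platform default. This is just a sketch, assuming the file is UTF-8; the path here is a hypothetical stand-in written by the example itself so it is self-contained:

```python
import io
import os
import tempfile

# Hypothetical sample file, created here only so the example runs on its own.
path = os.path.join(tempfile.mkdtemp(), "my_unicode_strings.txt")
with io.open(path, 'w', encoding='utf-8') as f:
    f.write(u'ãéíôú\n')

# Being explicit about the encoding avoids depending on the platform default,
# which is often where wrongly-decoded non-ASCII characters come from.
with io.open(path, 'r', encoding='utf-8') as f:
    line_list = f.readlines()
```

After this, line_list holds unicode strings with the accented characters intact.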
Then, for writing strings… open() works fine, but io.open() doesn’t:
# from io import open  # not a good idea, because
with open(output_path, 'w') as f:  # open(output_path, 'w', encoding='utf-8') # not working for me
    for li in line_list:
        f.write(li + u'ãéíôú')  # works fine with 'normal' open()
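I can’t speak for the Jython interpreter that Processing’s Python mode uses, but on plain CPython a version of the write loop with io.open() and an explicit encoding does seem to round-trip the accented characters. A minimal sketch, assuming the lines being written are all unicode strings (io.open() in text mode expects unicode text, not byte strings); output_path and line_list here are hypothetical stand-ins:

```python
import io
import os
import tempfile

output_path = os.path.join(tempfile.mkdtemp(), "out.txt")  # hypothetical path
line_list = [u'first line\n', u'second line\n']

# In text mode, io.open() encodes unicode text on the way out, so every
# value passed to write() should be a unicode string (the u'' literals here).
with io.open(output_path, 'w', encoding='utf-8') as f:
    for li in line_list:
        f.write(li + u'ãéíôú')

# Reading it back with the same encoding recovers the accented characters.
with io.open(output_path, 'r', encoding='utf-8') as f:
    content = f.read()
```

If this variant fails in Processing’s Python mode, a likely culprit is a plain byte string sneaking into the list, since mixing str and unicode is exactly where Python 2 writing tends to break.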
So I’m a bit frustrated with this inconsistency. Any insights on what I might be missing?
Cheers!