I'm not so upset about them that I'm going to get in giant internet flamewars like many other people, but I'm a bit negative towards some of the additions in this release overall. I've always thought that Python did a good job of being a capable language while still keeping its syntax straightforward and obvious. People that have no experience in Python can usually still read most Python code easily.
This release moves a little more away from that (in some cases just extending previous moves in that direction). For example, I don't think it's obvious at all what a function signature like this does (taken from the Real Python article I linked in my other comment):
def headline(text, /, border="♦", *, width=50):
Both the / and * arguments in there are unclear, and look more like some kind of dirty hack than real language features. It's a very niche feature, so it'll probably almost never come up and I'm not very concerned about it, but I'm not a fan of enshrining more and more things like that into the official syntax.
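For anyone who hasn't bumped into the new syntax yet: parameters before the / can only be passed positionally, and parameters after the * can only be passed by keyword. A quick sketch of what that signature allows and rejects (the function body is just a stand-in, not the article's):

def headline(text, /, border="♦", *, width=50):
    return f" {text} ".center(width, border)

headline("Python 3.8")                     # ok: text passed positionally
headline("Python 3.8", "*", width=30)      # ok: border may be positional or keyword
# headline(text="Python 3.8")              # TypeError: text is positional-only
# headline("Python 3.8", "*", 30)          # TypeError: width is keyword-only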
Maybe I'm just weird, but Python's syntax has never really felt that approachable for me; I spend 90% of my time writing TypeScript, Swift, PHP, Rust, or C#, for reference. I probably just don't spend enough time in Python, or am otherwise dumb, but remembering Python's specific idiosyncrasies for is, None, list functionality, Python's equivalents for map/filter/reduce, lambda functions, and so forth really frustrated me.
Not my intention to start a programming language war here, but it just never clicked in my brain. And that's okay. I'm just always bemused/confused when people call Python's syntax "obvious" when to me it's anything but.
so I'm consistently using the same version of Python which isn't always obvious. Or trying to figure out how the dependencies work with pip or virtualenv or pipenv or whatever else
I highly recommend conda (I prefer the miniconda distro) for this purpose. Out of the box, it provides Python env separation that just works and specifying your Python version is dead easy e.g.:
conda create -n testenv python=3.7 <other packages here>
conda activate testenv
In addition there is a solid default package list, while 99% of everything else you'd want is on the conda-forge channel (which has a bunch of good things like fresh LLVM builds, scientific libraries, optimised BLAS/LAPACK, etc.). Anything that's not present on conda you can always pip install to your $CONDA_PREFIX.
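For example (the package names below are just placeholders, and this assumes the env provides its own pip, which conda sets up by default alongside python):

conda install -n testenv -c conda-forge numpy    # grab something from the conda-forge channel
conda activate testenv
pip install some-pypi-only-package               # ends up under $CONDA_PREFIX, not the system Python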
I use it extensively for development and a lot of scientific analysis. It's the tool that finally got me to use non-system Python installs because unlike virtualenv and friends it didn't require any struggle to get up and running.
I even use it as an environment manager for C/C++ libraries (similarly install them to the conda prefix with the rpath set) because it's so convenient!
Essentially it's ensuring that your paths are properly set so that, e.g., you'll find the python version / packages in the 'activated' environment before anything you installed to the system python (with, e.g., the pip from python-pip and sudo -H pip install ...).
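A quick way to sanity-check which interpreter is actually winning on the path once an environment is activated (purely illustrative):

import sys

print(sys.executable)   # e.g. ~/miniconda3/envs/testenv/bin/python rather than /usr/bin/python
print(sys.prefix)       # should point inside the activated environment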
Hm, I'm curious as to specifically what you're having trouble with.
Workflow-wise, for instance, the fancy new way to do things is with pipenv. It'll be like npm. You go into your project directory, pipenv install <insert-package-here>, and it'll make a virtual environment and install things there.
Effectively, you just run pipenv shell and you're good to go. You can add that to your repo and run pipenv install on a new system and it'll install and make the venv.
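In other words, the day-to-day loop looks something like this (requests is just an example package):

cd myproject
pipenv install requests    # first use creates the virtualenv and a Pipfile
pipenv shell               # drop into the environment
# on a fresh clone of the repo:
pipenv install             # recreates the venv from Pipfile / Pipfile.lock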
Splitting up code in multiple files, I've probably looked this up the most and I still cannot figure out importing and __init__.py and all the other stuff that comes with it.
Rapid-fire advice for project / module management:
__init__.py has to be in a folder for it to be a module. I typically make my projects as ~/project/project/module/__init__.py, where module might be something like 'main', 'core', or whatever.
Think of __init__.py as a C/C++ header file -- most of the time you only want to define smaller functions in here.
Inside a module (folder), create source files similar to C++ source files, i.e., group related things together in ~/project/project/module/mycoolcode.py
Anything you want visible from outside the module, import into the __init__.py file, e.g.:
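A minimal sketch of what that re-export might look like (mycoolcode.py and the names in it are made up, following the layout above):

# ~/project/project/module/__init__.py
from .mycoolcode import CoolThing, do_cool_stuff   # re-export the public names

__all__ = ["CoolThing", "do_cool_stuff"]

Callers can then write from project.module import do_cool_stuff without caring which file it actually lives in. The same layout also gives you a natural home for a command-line entry point: code in ~/project/project/module/__main__.py is what runs when you invoke python -m project.module.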
So there's a few things going on here. If we look at a bit more of the directory structure it might look like:
project/
  README.md
  setup.py
  project/
    module/
      __init__.py
      __main__.py
    __init__.py
    ...
So, the top-level project folder can actually be named anything. The reason I typically name it identically to the inner-level project folder is so that I can install the project in editable mode (pip install -e .) while the project is still being heavily developed. This installs the project as (essentially) a symlink to the source directory and has a number of nice features, including:
you don't need to worry about the setup.py, which can be complicated and will change a lot as you develop.
any updates to your project are automatically reloaded the next time you import it into a python shell; no need to reinstall.
I've found that if the top-level and inner-level project folders aren't named identically, this trick breaks (appears to be a pip issue, the last I checked). None of this is required at all (see way more about packaging here), but I find it pretty useful.
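For what it's worth, the setup.py needed for that editable-install trick can stay tiny; a minimal sketch (the name/version are placeholders, not from the thread):

# setup.py
from setuptools import setup, find_packages

setup(
    name="project",            # placeholder: match the inner package name
    version="0.1.0",
    packages=find_packages(),
)

With that in place, pip install -e . from the top-level folder puts a link to the source tree on your path instead of copying it into site-packages.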
The real key here is that the inner-level project folder is the name that your package will have when you use the package externally. E.g., if you had:
Naaaayme/
  README.md
  setup.py
  coolio/
    __init__.py
you would use it externally as:
>>> from coolio import ...  # whatever is in your package
Then subfolders (e.g., the module) above would be available via from coolio.module import ...
To reframe the question, I have to use the directory name as the module name? I can't do "foo directory is the bar module"?
You can do this as follows (assuming you stick with the ~/project/project/module/ format):
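A sketch of the idea, assuming a folder named foo/ that you want to expose under the name bar (all names here are hypothetical):

# ~/project/project/__init__.py
from . import foo as bar    # the folder on disk is foo/, but expose it as bar

Then when you use your package you can do something like:

from project import bar     # bar is really the foo/ folder
bar.do_something()          # whatever foo/__init__.py exports

Note this exposes bar as an attribute of the package rather than a real submodule, so from project.bar import ... would still need the sys.modules trickery mentioned next.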
You can physically rename modules such that they don't correspond to the folder name, but that involves mucking about with sys.modules, which is typically a bad idea (and not very portable).
I started learning Go recently, and I'm beginning to appreciate all of the fancy syntactical features of languages like Python. is has no place in Python as long as you can do id(a) == id(b). Even with one of Python's mantras being that there should only be one right way to do something, they've constructed a language where that's not the case.
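To make the equivalence concrete (a trivial illustration):

a = [1, 2, 3]
b = a              # second name bound to the same object
c = [1, 2, 3]      # equal value, different object

print(a is b, id(a) == id(b))   # True True
print(a is c, id(a) == id(c))   # False False
print(a == c)                   # True: == compares values, is/id compare identity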
I'm not writing Python at work or at home, but the fact that the type system of the language gets better with each release makes me happy. I don't think I'll ever willingly participate in maintaining a large project written in a dynamically-typed language, but if I ever do, I would really like that language to be Python.
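For what it's worth, 3.8 itself adds several of those type-system niceties, e.g. typing.Literal, Final, Protocol, and TypedDict; a quick sketch:

from typing import Final, Literal, Protocol, TypedDict

class Movie(TypedDict):         # a dict with a fixed set of typed keys
    title: str
    year: int

class Greeter(Protocol):        # structural ("duck") typing for checkers
    def greet(self, name: str) -> str: ...

MAX_RETRIES: Final = 3          # checkers will flag reassignment

def open_mode(mode: Literal["r", "w"]) -> None:
    ...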
multiprocessing can now use shared memory segments to avoid pickling costs between processes
I thought this was already in the stdlib (link to the 3.7 docs). Was something else added? The release notes don't link to anything or mention any classes specifically.
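For what it's worth, the new piece in 3.8 appears to be the multiprocessing.shared_memory module (SharedMemory and ShareableList); the older multiprocessing.Value/Array style primitives were already there. A minimal single-process sketch of the new API:

from multiprocessing import shared_memory

# Create a named shared-memory block and write into it.
shm = shared_memory.SharedMemory(create=True, size=16)
shm.buf[:5] = b"hello"

# A second process would attach by name; done here in-process for brevity.
other = shared_memory.SharedMemory(name=shm.name)
print(bytes(other.buf[:5]))   # b'hello'

other.close()
shm.close()
shm.unlink()   # release the segment once every process is done with it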
I don't really like when they mess with the syntax. I learned Python last year but I'm not currently using it; if I start using it again, say, next year, I'll have trouble reading code because I didn't keep up with the fancy shiny new trends.
They should rather focus on rewriting the several parts of the standard library that were written in a non-Pythonic way (i.e. do not follow PEP 8) back in the early days, which makes it a pain if you want to skim over their source code today.
The performance upgrades are welcome though.
Here's a pretty good writeup that goes through some of the new features in a lot more detail: https://realpython.com/python38-new-features/