---
layout: post
title: "Python 3.7 feature walkthrough"
date: "2019-01-12 16:20:41 +0530"
tag:
  - Python
  - CorePython
  - Python3.7
---


In this post, I will walk through the improvements introduced in core Python
3.7. Below is a summary of the features covered in this post.
| 14 | + |
| 15 | +* Breakpoints |
| 16 | + |
| 17 | +* Subprocess |
| 18 | + |
| 19 | +* Dataclass |
| 20 | + |
| 21 | +* Int with underscores |
| 22 | + |
| 23 | +* Namedtuples |
| 24 | + |
| 25 | +* Hash-based Python object file |
| 26 | + |
### breakpoint()

A breakpoint is an extremely important tool for debugging. Since I started
learning Python, I have been using the same API for setting breakpoints. With
this release, ```breakpoint()``` is introduced as a built-in function. Because
it lives in the built-in scope, you don't have to import it from any module;
you can simply call it wherever you want to pause execution. This approach is
handier than importing ```pdb``` and calling ```pdb.set_trace()```.

Here is the code used in the example.
```python
for i in range(100):
    if i == 10:
        # Execution pauses here and drops into the pdb prompt
        breakpoint()
    else:
        print(i)
```

### PYTHONBREAKPOINT

Previously, there was no handy option to disable or enable all breakpoints
with a single flag. With this release, you can use the ```PYTHONBREAKPOINT```
environment variable for that: setting ```PYTHONBREAKPOINT``` to ```0```
disables every ```breakpoint()``` call in your code.
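
Below is a minimal sketch of how this behaves. The loop is made up for
illustration; the key point is that the default ```sys.breakpointhook()```
consults ```PYTHONBREAKPOINT``` each time ```breakpoint()``` is called, so a
value of ```0``` turns the call into a no-op.

```python
import os

# Illustrative only: normally you would set PYTHONBREAKPOINT=0 in the shell
# or in your deployment environment before starting the interpreter.
os.environ["PYTHONBREAKPOINT"] = "0"

for i in range(5):
    breakpoint()  # skipped silently because PYTHONBREAKPOINT is "0"
    print(i)
```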

##### I advise setting "PYTHONBREAKPOINT=0" in your production environment to avoid unwanted pauses at forgotten breakpoints.

### Subprocess.run(capture_output=True)

You can capture the standard output stream (stdout) and the standard error
stream (stderr) by enabling the ```capture_output``` parameter of the
```subprocess.run()``` function.

You should note that this is a convenience over piping the streams manually.
For example, ```subprocess.run(["ls", "-l", "/var"], stdout=subprocess.PIPE,
stderr=subprocess.PIPE)``` was the previous way to capture the output of
```stdout``` and ```stderr```.
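
Here is a short sketch of the new form. The command is just the example from
above; ```capture_output=True``` is equivalent to passing
```stdout=subprocess.PIPE``` and ```stderr=subprocess.PIPE```, and
```text=True``` (also accepted since 3.7) decodes the captured bytes to
strings.

```python
import subprocess

# Run a command and capture both streams (Python 3.7+).
result = subprocess.run(
    ["ls", "-l", "/var"],
    capture_output=True,  # same as stdout=PIPE, stderr=PIPE
    text=True,            # decode bytes to str
)

print(result.returncode)
print(result.stdout)
print(result.stderr)
```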

### Dataclasses

The new class-level decorator ```@dataclass``` is introduced with the
```dataclasses``` module. Python is well known for features that let you
achieve more by writing less, and this module is likely to receive further
updates that will shave even more lines off your code. A basic understanding
of type hints is needed to follow this feature.

When you wrap your class with the ```@dataclass``` decorator, the decorator
writes the obvious constructor code for you. Additionally, it defines
behaviour for dunder methods such as ```__repr__()``` and ```__eq__()```
(and, depending on the decorator's options, ```__hash__()```).

Below is the code before the ```dataclasses.dataclass``` decorator is applied.

```python
class Point:

    def __init__(self, x, y):
        self.x = x
        self.y = y
```

After wrapping it with the ```@dataclass``` decorator, it reduces to the code
below.

```python
from dataclasses import dataclass


@dataclass
class Point:
    x: float
    y: float
```
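
To see what the generated methods buy you, here is a small usage sketch (the
values are arbitrary):

```python
# The generated __init__, __repr__ and __eq__ in action.
p1 = Point(1.0, 2.0)
p2 = Point(1.0, 2.0)

print(p1)        # Point(x=1.0, y=2.0)
print(p1 == p2)  # True -- compared field by field
```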

### Namedtuples

Namedtuples are a very helpful data structure, yet I find they are less known
among developers. With this release, you can set default values for the
fields.

##### Note: Defaults are applied to the rightmost fields. In the example below, the default value ``2`` is assigned to the field ``y``.

Below is the code used in the example.

```python
from collections import namedtuple


# The new "defaults" argument (Python 3.7+) applies to the rightmost fields.
Point = namedtuple("Point", ["x", "y"], defaults=[2])
p = Point(1)
print(p)  # Point(x=1, y=2)
```

### .pyc

**.pyc** files are the object files the interpreter generates every time you
change a source file (.py). They cache the compiled bytecode so that the
interpreter can reuse it the next time the code runs. The present approach to
detect an outdated object file is to compare metadata of the source file, such
as its last-modified timestamp. With this release, that check can instead be
done with a hash-based approach, which is deterministic and consistent across
platforms, unlike last-modified dates. The hash-based variant is opt-in for
now; core Python will continue with the metadata approach by default and
migrate to the hash-based approach gradually.
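
As a sketch of the opt-in mechanism, the standard ```py_compile``` module
accepts an ```invalidation_mode``` argument in 3.7; the file name below is
just a placeholder.

```python
import py_compile

# Compile a module to a hash-based .pyc (PEP 552).
# "example.py" is a placeholder; CHECKED_HASH stores a hash of the source
# and verifies it on import instead of comparing timestamps.
py_compile.compile(
    "example.py",
    invalidation_mode=py_compile.PycInvalidationMode.CHECKED_HASH,
)
```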