Commit 587a5d5

Replace black with ruff linter / formatter (#392)
## Problem

While researching how to integrate some Rust extensions into our Python code, I heard about this alternative tool for code formatting that also does linting. It is much faster than black and can also do code-style linting to enforce some best practices for Python. Ruff is implemented by [Astral](https://docs.astral.sh/), a company aiming to create a "cargo for Python"-type experience with Python tools that are implemented in Rust, making them both fast and reliable. I'm interested in some of the other things they are doing (e.g. uv for dependency management), but this seemed like a lower-stakes way to test the waters with their tooling.

## Solution

Main changes are in:

- Dev dependency changes: add ruff, remove black.
- Remove black configs from pyproject.toml.
- Add ruff configs to pyproject.toml. Mostly stuck with the defaults, although I disabled a few of the more annoying lint rules for now on a per-file basis.
- Adjust CI to run ruff checks instead of black.
- Update CONTRIBUTING.

The rest of this large diff is due to formatting and lint fixes for various code-style issues. These checks can be run manually with `poetry run ruff check --fix` and `poetry run ruff format`, but otherwise they should be triggered automatically by pre-commit hooks. I documented this in CONTRIBUTING.

## Type of Change

- [x] Infrastructure change (CI configs, etc.)

## Test Plan

Tests should still be green. No functional impact expected.
1 parent a41b9f8 commit 587a5d5
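
The pyproject.toml diff itself is not included in the visible portion of this commit view, so the exact ruff settings are not shown here. As a rough, illustrative sketch only — the line length, rule codes, and file globs below are assumptions, not the values used in this commit — a "mostly defaults, with a few per-file exemptions" ruff setup in `pyproject.toml` tends to look like this:

```toml
# Illustrative sketch only: the actual values added to pyproject.toml by this
# commit are not visible in this view.
[tool.ruff]
line-length = 100                   # assumed value; ruff's own default is 88

[tool.ruff.lint]
select = ["E4", "E7", "E9", "F"]    # ruff's default rule set (pycodestyle + pyflakes)

[tool.ruff.lint.per-file-ignores]
# Example of relaxing a noisier rule for specific paths, along the lines the
# commit message describes; the actual rules and paths are assumptions.
"tests/**" = ["E402"]               # e.g. allow imports that are not at the top of the file
```

Whatever the real settings are, `poetry run ruff check --fix` and `poetry run ruff format` pick them up from `pyproject.toml` automatically, which is why no extra flags are needed in CI or in the pre-commit hooks.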

84 files changed: 1,109 additions and 832 deletions.

.github/workflows/lint.yaml

Lines changed: 1 addition & 1 deletion
```diff
@@ -6,4 +6,4 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - uses: actions/checkout@v3
-      - uses: psf/black@stable
+      - uses: chartboost/ruff-action@v1
```

.pre-commit-config.yaml

Lines changed: 8 additions & 4 deletions
```diff
@@ -6,8 +6,12 @@ repos:
       - id: end-of-file-fixer
       - id: check-yaml
       - id: check-added-large-files
-  - repo: https://github.com/psf/black-pre-commit-mirror
-    rev: 24.4.2
+  - repo: https://github.com/astral-sh/ruff-pre-commit
+    # Ruff version.
+    rev: v0.6.7
     hooks:
-      - id: black
-        language_version: python3.12
+      # Run the linter.
+      - id: ruff
+        args: [ --fix ]
+      # Run the formatter.
+      - id: ruff-format
```

CONTRIBUTING.md

Lines changed: 78 additions & 57 deletions
````diff
@@ -1,4 +1,4 @@
-# Contributing
+# Contributing
 
 ## Installing development versions
 
@@ -17,45 +17,45 @@ poetry add git+https://github.com/pinecone-io/pinecone-python-client.git@44fc7ed
 ```
 
 
-## Developing locally with Poetry
+## Developing locally with Poetry
 
 [Poetry](https://python-poetry.org/) is a tool that combines [virtualenv](https://virtualenv.pypa.io/en/latest/) usage with dependency management, to provide a consistent experience for project maintainers and contributors who need to develop the pinecone-python-client
-as a library.
+as a library.
 
-A common need when making changes to the Pinecone client is to test your changes against existing Python code or Jupyter Notebooks that `pip install` the Pinecone Python client as a library.
+A common need when making changes to the Pinecone client is to test your changes against existing Python code or Jupyter Notebooks that `pip install` the Pinecone Python client as a library.
 
-Developers want to be able to see their changes to the library immediately reflected in their main application code, as well as to track all changes they make in git, so that they can be contributed back in the form of a pull request.
+Developers want to be able to see their changes to the library immediately reflected in their main application code, as well as to track all changes they make in git, so that they can be contributed back in the form of a pull request.
 
-The Pinecone Python client therefore supports Poetry as its primary means of enabling a consistent local development experience. This guide will walk you through the setup process so that you can:
+The Pinecone Python client therefore supports Poetry as its primary means of enabling a consistent local development experience. This guide will walk you through the setup process so that you can:
 1. Make local changes to the Pinecone Python client that are separated from your system's Python installation
 2. Make local changes to the Pinecone Python client that are immediately reflected in other local code that imports the pinecone client
 3. Track all your local changes to the Pinecone Python client so that you can contribute your fixes and feature additions back via GitHub pull requests
 
 ### Step 1. Fork the Pinecone python client repository
 
-On the [GitHub repository page](https://github.com/pinecone-io/pinecone-python-client) page, click the fork button at the top of the screen and create a personal fork of the repository:
+On the [GitHub repository page](https://github.com/pinecone-io/pinecone-python-client) page, click the fork button at the top of the screen and create a personal fork of the repository:
 
 ![Create a GitHub fork of the Pinecone Python client](./docs/pinecone-python-client-fork.png)
 
-It will take a few seconds for your fork to be ready. When it's ready, **clone your fork** of the Pinecone python client repository to your machine.
+It will take a few seconds for your fork to be ready. When it's ready, **clone your fork** of the Pinecone python client repository to your machine.
 
-Change directory into the repository, as we'll be setting up a virtualenv from within the root of the repository.
+Change directory into the repository, as we'll be setting up a virtualenv from within the root of the repository.
 
-### Step 1. Install Poetry
+### Step 1. Install Poetry
 
-Visit [the Poetry site](https://python-poetry.org/) for installation instructions.
+Visit [the Poetry site](https://python-poetry.org/) for installation instructions.
 
-### Step 2. Install dependencies
+### Step 2. Install dependencies
 
-Run `poetry install` from the root of the project.
+Run `poetry install` from the root of the project.
 
 ### Step 3. Activate the Poetry virtual environment and verify success
 
-Run `poetry shell` from the root of the project. At this point, you now have a virtualenv set up in this directory, which you can verify by running:
+Run `poetry shell` from the root of the project. At this point, you now have a virtualenv set up in this directory, which you can verify by running:
 
 `poetry env info`
 
-You should see something similar to the following output:
+You should see something similar to the following output:
 
 ```bash
 Virtualenv
@@ -73,17 +73,61 @@ Path: /home/linuxbrew/.linuxbrew/opt/[email protected]
 ```
 If you want to extract only the path to your new virtualenv, you can run `poetry env info --path`
 
-## Loading your virtualenv in another shell
+### Step 4. Enable pre-commit hooks.
 
-It's a common need when developing against this client to load it as part of some other application or Jupyter Notebook code, modify
-it directly, see your changes reflected immediately and also have your changes tracked in git so you can contribute them back.
+Run `poetry run pre-commit install` to enable checks to run when you commit so you don't have to find out during your CI run that minor lint issues need to be addressed.
 
-It's important to understand that, by default, if you open a new shell or terminal window, or, for example, a new pane in a tmux session,
-your new shell will not yet reference the new virtualenv you created in the previous step.
+## Common tasks
+
+### Running tests
+
+- Unit tests: `make test-unit`
+- Integration tests: `PINECONE_API_KEY="YOUR API KEY" make test-integration`
+- Run the tests in a single file: `poetry run pytest tests/unit/data/test_bulk_import.py -s -vv`
+
+### Running the ruff linter / formatter
+
+These should automatically trigger if you have enabled pre-commit hooks with `poetry run pre-commit install`. But in case you want to trigger these yourself, you can run them like this:
+
+```
+poetry run ruff check --fix # lint rules
+poetry run ruff format # formatting
+```
+
+If you want to adjust the behavior of ruff, configurations are in `pyproject.toml`.
+
+
+### Consuming API version upgrades
+
+These instructions can only be followed by Pinecone employees with access to our private APIs repository.
+
+Prerequisites:
+- You must be an employee with access to private Pinecone repositories
+- You must have [Docker Desktop](https://www.docker.com/products/docker-desktop/) installed and running. Our code generation script uses a dockerized version of the OpenAPI CLI.
+- You must have initialized the git submodules under codegen
+
+```sh
+git submodule
+```
+
+To regenerate the generated portions of the client with the latest version of the API specifications, you need to have Docker Desktop running on your local machine.
+
+```sh
+./codegen/
+```
+
+
+## Loading your virtualenv in another shell
+
+It's a common need when developing against this client to load it as part of some other application or Jupyter Notebook code, modify
+it directly, see your changes reflected immediately and also have your changes tracked in git so you can contribute them back.
+
+It's important to understand that, by default, if you open a new shell or terminal window, or, for example, a new pane in a tmux session,
+your new shell will not yet reference the new virtualenv you created in the previous step.
 
 ### Step 1. Get the path to your virtualenv
 
-We're going to first get the path to the virtualenv we just created, by running:
+We're going to first get the path to the virtualenv we just created, by running:
 
 ```bash
 poetry env info --path
@@ -93,75 +137,52 @@ You'll get a path similar to this one: `/home/youruser/.cache/pypoetry/virtuale
 
 ### Step 2. Load your existing virtualenv in your new shell
 
-Within this path is a shell script that lives at `<your-virtualenv-path>/bin/activate`. Importantly, you cannot simply run this script, but you
-must instead source it like so:
+Within this path is a shell script that lives at `<your-virtualenv-path>/bin/activate`. Importantly, you cannot simply run this script, but you
+must instead source it like so:
 
 ```bash
 source /home/youruser/.cache/pypoetry/virtualenvs/pinecone-fWu70vbC-py3.9/bin/activate
 ```
 In the above example, ensure you're using your own virtualenv path as returned by `poetry env info --path`.
 
-### Step 3. Test out your virtualenv
+### Step 3. Test out your virtualenv
 
-Now, we can test that our virtualenv is working properly by adding a new test module and function to the `pinecone` client within our virtualenv
-and running it from the second shell.
+Now, we can test that our virtualenv is working properly by adding a new test module and function to the `pinecone` client within our virtualenv
+and running it from the second shell.
 
 #### Create a new test file in pinecone-python-client
-In the root of your working directory of the `pinecone-python-client` where you first ran `poetry shell`, add a new file named `hello_virtualenv.py` under the `pinecone` folder.
+In the root of your working directory of the `pinecone-python-client` where you first ran `poetry shell`, add a new file named `hello_virtualenv.py` under the `pinecone` folder.
 
-In that file write the following:
+In that file write the following:
 
 ```python
 def hello():
     print("Hello, from your virtualenv!")
 ```
-Save the file.
+Save the file.
 
-#### Create a new test file in your second shell
-This step demonstrates how you can immediately test your latest Pinecone client code from any local Python application or Jupyter Notebook:
+#### Create a new test file in your second shell
+This step demonstrates how you can immediately test your latest Pinecone client code from any local Python application or Jupyter Notebook:
 
-In your second shell, where you ran `source` to load your virtualenv, create a python file named `test.py` and write the following:
+In your second shell, where you ran `source` to load your virtualenv, create a python file named `test.py` and write the following:
 
 ```python
 from pinecone import hello_virtualenv
 
 hello_virtualenv.hello()
 ```
 
-Save the file. Run it with your Python binary. Depending on your system, this may either be `python` or `python3`:
+Save the file. Run it with your Python binary. Depending on your system, this may either be `python` or `python3`:
 
 ```bash
 python3 test.py
 ```
 
-You should see the following output:
+You should see the following output:
 
 ```bash
 ❯ python3 test.py
 Hello, from your virtualenv!
 ```
 
 If you experience any issues please [file a new issue](https://github.com/pinecone-io/pinecone-python-client/issues/new).
-
-
-## Consuming API version upgrades
-
-These instructions can only be followed by Pinecone employees with access to our private APIs repository.
-
-Prerequisites:
-- You must be an employee with access to private Pinecone repositories
-- You must have [Docker Desktop](https://www.docker.com/products/docker-desktop/) installed and running. Our code generation script uses a dockerized version of the OpenAPI CLI.
-- You must have initialized the git submodules under codegen
-
-```sh
-git submodule
-```
-
-
-To regenerate the generated portions of the client with the latest version of the API specifications, you need to have Docker Desktop running on your local machine.
-
-
-
-```sh
-./codegen/
-```
````

codegen/build-oas.sh

Lines changed: 6 additions & 6 deletions
```diff
@@ -71,7 +71,7 @@ generate_client() {
 
   oas_file="codegen/apis/_build/${version}/${module_name}_${version}.oas.yaml"
   package_name="pinecone.${py_module_name}.openapi.${module_name}"
-
+
   verify_file_exists $oas_file
   verify_directory_exists $template_dir
 
@@ -106,9 +106,9 @@ extract_shared_classes() {
   # Define the list of shared source files
  sharedFiles=(
    "api_client"
-    "configuration"
-    "exceptions"
-    "model_utils"
+    "configuration"
+    "exceptions"
+    "model_utils"
    "rest"
  )
 
@@ -127,7 +127,7 @@ extract_shared_classes() {
    done
  done
 
-  # Remove the docstring headers that aren't really correct in the
+  # Remove the docstring headers that aren't really correct in the
  # context of this new shared package structure
  find "$target_directory" -name "*.py" -print0 | xargs -0 -I {} sh -c 'sed -i "" "/^\"\"\"/,/^\"\"\"/d" "{}"'
 
@@ -166,4 +166,4 @@
 extract_shared_classes
 
 # Format generated files
-poetry run black "${destination}"
+poetry run ruff format "${destination}"
```

pinecone/config/__init__.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -4,5 +4,5 @@
 from .config import ConfigBuilder, Config
 from .pinecone_config import PineconeConfig
 
-if os.getenv("PINECONE_DEBUG") != None:
+if os.getenv("PINECONE_DEBUG") is not None:
     logging.basicConfig(level=logging.DEBUG)
```

pinecone/config/config.py

Lines changed: 4 additions & 10 deletions
```diff
@@ -3,9 +3,7 @@
 
 from pinecone.exceptions.exceptions import PineconeConfigurationError
 from pinecone.config.openapi import OpenApiConfigFactory
-from pinecone.core.openapi.shared.configuration import (
-    Configuration as OpenApiConfiguration,
-)
+from pinecone.core.openapi.shared.configuration import Configuration as OpenApiConfiguration
 from pinecone.utils import normalize_host
 from pinecone.utils.constants import SOURCE_TAG
 
@@ -72,15 +70,11 @@ def build(
 
     @staticmethod
     def build_openapi_config(
-        config: Config,
-        openapi_config: Optional[OpenApiConfiguration] = None,
-        **kwargs,
+        config: Config, openapi_config: Optional[OpenApiConfiguration] = None, **kwargs
     ) -> OpenApiConfiguration:
         if openapi_config:
             openapi_config = OpenApiConfigFactory.copy(
-                openapi_config=openapi_config,
-                api_key=config.api_key,
-                host=config.host,
+                openapi_config=openapi_config, api_key=config.api_key, host=config.host
             )
         elif openapi_config is None:
             openapi_config = OpenApiConfigFactory.build(api_key=config.api_key, host=config.host)
@@ -95,7 +89,7 @@ def build_openapi_config(
         openapi_config.proxy_headers = config.proxy_headers
         if config.ssl_ca_certs:
             openapi_config.ssl_ca_cert = config.ssl_ca_certs
-        if config.ssl_verify != None:
+        if config.ssl_verify is not None:
             openapi_config.verify_ssl = config.ssl_verify
 
         return openapi_config
```

pinecone/config/openapi.py

Lines changed: 5 additions & 11 deletions
```diff
@@ -7,9 +7,7 @@
 
 from urllib3.connection import HTTPConnection
 
-from pinecone.core.openapi.shared.configuration import (
-    Configuration as OpenApiConfiguration,
-)
+from pinecone.core.openapi.shared.configuration import Configuration as OpenApiConfiguration
 
 TCP_KEEPINTVL = 60  # Sec
 TCP_KEEPIDLE = 300  # Sec
@@ -29,7 +27,9 @@ def build(cls, api_key: str, host: Optional[str] = None, **kwargs):
         return openapi_config
 
     @classmethod
-    def copy(cls, openapi_config: OpenApiConfiguration, api_key: str, host: str) -> OpenApiConfiguration:
+    def copy(
+        cls, openapi_config: OpenApiConfiguration, api_key: str, host: str
+    ) -> OpenApiConfiguration:
         """
         Copy a user-supplied openapi configuration and update it with the user's api key and host.
         If they have not specified other socket configuration, we will use the default values.
@@ -88,13 +88,7 @@ def _get_socket_options(
             and hasattr(socket, "TCP_KEEPCNT")
         ):
             socket_params += [(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, keep_alive_idle_sec)]
-            socket_params += [
-                (
-                    socket.IPPROTO_TCP,
-                    socket.TCP_KEEPINTVL,
-                    keep_alive_interval_sec,
-                )
-            ]
+            socket_params += [(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, keep_alive_interval_sec)]
             socket_params += [(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, keep_alive_tries)]
 
             # TCP Keep Alive Probes for Windows OS
```

pinecone/config/pinecone_config.py

Lines changed: 7 additions & 5 deletions
```diff
@@ -17,7 +17,12 @@ def build(
         additional_headers: Optional[Dict[str, str]] = {},
         **kwargs,
     ) -> Config:
-        host = host or kwargs.get("host") or os.getenv("PINECONE_CONTROLLER_HOST") or DEFAULT_CONTROLLER_HOST
+        host = (
+            host
+            or kwargs.get("host")
+            or os.getenv("PINECONE_CONTROLLER_HOST")
+            or DEFAULT_CONTROLLER_HOST
+        )
         headers_json = os.getenv("PINECONE_ADDITIONAL_HEADERS")
         if headers_json:
             try:
@@ -27,8 +32,5 @@ def build(
                 logger.warn(f"Ignoring PINECONE_ADDITIONAL_HEADERS: {e}")
 
         return ConfigBuilder.build(
-            api_key=api_key,
-            host=host,
-            additional_headers=additional_headers,
-            **kwargs,
+            api_key=api_key, host=host, additional_headers=additional_headers, **kwargs
         )
```
