This repository was archived by the owner on Aug 28, 2025. It is now read-only.

Commit aa39ef1

speediedan and Borda authored

sphinx conf warning fix (#175)

* fix sphinx language warning
* fix stale ffmpeg ref #178

Co-authored-by: Jirka <[email protected]>

1 parent 3fda6eb, commit aa39ef1

File tree: 6 files changed, +6 -2 lines changed

.azure/ipynb-publish.yml (1 addition, 0 deletions)

@@ -62,6 +62,7 @@ jobs:
   - bash: |
       set -e
+      sudo apt-get update -q --fix-missing
       sudo apt install -y tree ffmpeg
       pip --version
       pip install --requirement requirements.txt

.azure/ipynb-tests.yml (1 addition, 0 deletions)

@@ -42,6 +42,7 @@ jobs:
   - bash: |
       set -e
+      sudo apt-get update -q --fix-missing
       sudo apt install -y tree ffmpeg
       pip --version
       pip install --requirement requirements.txt

.github/workflows/ci_testing.yml (1 addition, 0 deletions)

@@ -39,6 +39,7 @@ jobs:

   - name: Install dependencies
     run: |
+      sudo apt-get update -q --fix-missing
       sudo apt install -y ffmpeg
       pip --version
       pip install --requirement requirements.txt --find-links https://download.pytorch.org/whl/cpu/torch_stable.html

docs/source/conf.py (1 addition, 1 deletion)

@@ -99,7 +99,7 @@
 #
 # This is also used if you do content translation via gettext catalogs.
 # Usually you set "language" from the command line for these cases.
-language = None
+language = "en"

 # List of patterns, relative to source directory, that match files and
 # directories to ignore when looking for source files.
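For context on the change above: recent Sphinx releases emit a configuration warning when `language` is left as `None` and fall back to `"en"`, so the commit sets the language explicitly. A minimal illustrative fragment (not the full conf.py):

```python
# docs/source/conf.py -- illustrative fragment, not the complete file.
# Sphinx warns when `language` is None; an explicit string silences
# the warning. This is also used for gettext content translation.
language = "en"
```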

lightning_examples/finetuning-scheduler/.meta.yml (1 addition, 0 deletions)

@@ -15,5 +15,6 @@ description: |
   and foundational model weights. The required dependencies are installed via the finetuning-scheduler ``[examples]`` extra.
 requirements:
   - finetuning-scheduler[examples]
+  - hydra-core>=1.1.0
 accelerator:
   - GPU

lightning_examples/finetuning-scheduler/finetuning-scheduler.py (1 addition, 1 deletion)

@@ -59,7 +59,7 @@
 # %% [markdown]
 # ## The Default Finetuning Schedule
 #
-# Schedule definition is facilitated via the [gen_ft_schedule](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts_supporters.html#finetuning_scheduler.fts_supporters.SchedulingMixin.gen_ft_schedule) method which dumps a default finetuning schedule (by default using a naive, 2-parameters per level heuristic) which can be adjusted as
+# Schedule definition is facilitated via the [gen_ft_schedule](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts_supporters.html#finetuning_scheduler.fts_supporters.ScheduleImplMixin.gen_ft_schedule) method which dumps a default finetuning schedule (by default using a naive, 2-parameters per level heuristic) which can be adjusted as
 # desired by the user and/or subsequently passed to the callback. Using the default/implicitly generated schedule will likely be less computationally efficient than a user-defined finetuning schedule but is useful for exploring a model's finetuning behavior and can serve as a good baseline for subsequent explicit schedule refinement.
 # While the current version of [FinetuningScheduler](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts.html#finetuning_scheduler.fts.FinetuningScheduler) only supports single optimizer and (optional) lr_scheduler configurations, per-phase maximum learning rates can be set as demonstrated in the next section.

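The "naive, 2-parameters per level heuristic" the docstring refers to can be illustrated with a small standalone sketch. This is a hypothetical helper for intuition only, not the library's actual `gen_ft_schedule` implementation:

```python
def naive_ft_schedule(param_names, params_per_level=2):
    """Illustrative stand-in for the default schedule heuristic:
    group parameter names into finetuning phases, two per level.
    Phase 0 is thawed first; deeper phases follow in later epochs."""
    schedule = {}
    for level, start in enumerate(range(0, len(param_names), params_per_level)):
        schedule[level] = {"params": param_names[start:start + params_per_level]}
    return schedule

# Hypothetical parameter names, ordered head-first as a typical
# implicit schedule would thaw them.
names = ["classifier.weight", "classifier.bias",
         "encoder.layer.1.weight", "encoder.layer.1.bias"]
print(naive_ft_schedule(names))
```

A user-defined schedule would replace this uniform grouping with phases chosen for the model's architecture, which is why the docs describe the default as a baseline for subsequent explicit refinement.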
Comments (0)