stable-diffusion-webui
Commit a6d593a6 authored Oct 20, 2022 by Melan
Fixed a typo in a variable
parent 29e74d6e
Showing 1 changed file with 5 additions and 5 deletions

modules/textual_inversion/textual_inversion.py (+5, -5)
@@ -260,11 +260,11 @@ def train_embedding(embedding_name, learn_rate, batch_size, data_root, log_direc
     last_saved_image = "<none>"
     embedding_yet_to_be_embedded = False
 
-    ititial_step = embedding.step or 0
-    if ititial_step > steps:
+    initial_step = embedding.step or 0
+    if initial_step > steps:
         return embedding, filename
 
-    scheduler = LearnRateScheduler(learn_rate, steps, ititial_step)
+    scheduler = LearnRateScheduler(learn_rate, steps, initial_step)
     optimizer = torch.optim.AdamW([embedding.vec], lr=scheduler.learn_rate)
 
     if shared.opts.training_enable_tensorboard:
@@ -273,9 +273,9 @@ def train_embedding(embedding_name, learn_rate, batch_size, data_root, log_direc
         log_dir=os.path.join(log_directory, "tensorboard"),
         flush_secs=shared.opts.training_tensorboard_flush_every)
 
-    pbar = tqdm.tqdm(enumerate(ds), total=steps-ititial_step)
+    pbar = tqdm.tqdm(enumerate(ds), total=steps-initial_step)
     for i, entries in pbar:
-        embedding.step = i + ititial_step
+        embedding.step = i + initial_step
 
         scheduler.apply(optimizer, embedding.step)
         if scheduler.finished:
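For context, the renamed variable drives the training-resume logic: training starts from the step count already stored in the embedding and only runs the remaining steps, which is also what the tqdm progress bar total is based on. Below is a minimal, self-contained sketch of that logic under stated assumptions; the Embedding dataclass and resume_bounds helper are illustrative stand-ins, not the webui's actual classes.

# Minimal sketch of the resume logic touched by this commit (illustrative only;
# `Embedding` and `resume_bounds` are stand-ins, not the webui's real classes).
from __future__ import annotations

from dataclasses import dataclass
from typing import Optional


@dataclass
class Embedding:
    # Only the saved step counter matters for this sketch.
    step: Optional[int] = None


def resume_bounds(embedding: Embedding, steps: int) -> Optional[tuple[int, int]]:
    """Return (start_step, remaining_steps), or None if training is already done."""
    initial_step = embedding.step or 0      # was misspelled `ititial_step` before this commit
    if initial_step > steps:
        return None                         # mirrors the early `return embedding, filename`
    return initial_step, steps - initial_step  # remaining steps feed the tqdm `total=` argument


# An embedding saved at step 400 and trained to 1000 total steps resumes at 400
# with 600 steps left; a fresh embedding (step=None) starts from 0.
print(resume_bounds(Embedding(step=400), steps=1000))   # (400, 600)
print(resume_bounds(Embedding(step=None), steps=1000))  # (0, 1000)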