stable-diffusion-webui / Commits / 143ed5a4

Unverified commit 143ed5a4, authored Jan 06, 2023 by AUTOMATIC1111, committed by GitHub on Jan 06, 2023.
Merge pull request #6384 from faber6/loads-ti-from-subdirs

allow loading embeddings from subdirectories

Parents: 8a13afd2, 81133d41
Showing 1 changed file with 12 additions and 11 deletions:
modules/textual_inversion/textual_inversion.py (+12, -11)
modules/textual_inversion/textual_inversion.py (view file @ 143ed5a4)

@@ -149,19 +149,20 @@ class EmbeddingDatabase:
             else:
                 self.skipped_embeddings[name] = embedding

-        for fn in os.listdir(self.embeddings_dir):
-            try:
-                fullfn = os.path.join(self.embeddings_dir, fn)
-
-                if os.stat(fullfn).st_size == 0:
-                    continue
-
-                process_file(fullfn, fn)
-            except Exception:
-                print(f"Error loading embedding {fn}:", file=sys.stderr)
-                print(traceback.format_exc(), file=sys.stderr)
-                continue
+        for root, dirs, fns in os.walk(self.embeddings_dir):
+            for fn in fns:
+                try:
+                    fullfn = os.path.join(root, fn)
+
+                    if os.stat(fullfn).st_size == 0:
+                        continue
+
+                    process_file(fullfn, fn)
+                except Exception:
+                    print(f"Error loading embedding {fn}:", file=sys.stderr)
+                    print(traceback.format_exc(), file=sys.stderr)
+                    continue

         print(f"Textual inversion embeddings loaded({len(self.word_embeddings)}): {', '.join(self.word_embeddings.keys())}")
         if len(self.skipped_embeddings) > 0:
             print(f"Textual inversion embeddings skipped({len(self.skipped_embeddings)}): {', '.join(self.skipped_embeddings.keys())}")
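For context on what the diff changes: the old loader scanned only the top level of the embeddings directory with os.listdir, while the merged version uses os.walk, which also descends into subdirectories. Below is a minimal standalone sketch of that difference, separate from the webui code; the directory name and helper variables are illustrative assumptions, not part of the commit.

import os

embeddings_dir = "embeddings"  # hypothetical path, used only for this illustration

# Flat scan (old behaviour): sees only entries directly inside embeddings_dir,
# so a file stored at embeddings/some_folder/foo.pt would never be picked up.
flat = [
    fn for fn in os.listdir(embeddings_dir)
    if os.path.isfile(os.path.join(embeddings_dir, fn))
]

# Recursive scan (new behaviour): os.walk yields a (root, dirs, files) triple
# for embeddings_dir and every directory below it, so nested files are found.
recursive = []
for root, dirs, fns in os.walk(embeddings_dir):
    for fn in fns:
        fullfn = os.path.join(root, fn)
        if os.stat(fullfn).st_size == 0:  # skip empty files, as the commit does
            continue
        recursive.append(fullfn)

print(f"flat scan: {len(flat)} files, recursive scan: {len(recursive)} files")

Note that process_file is still called with the bare file name fn, so the embedding's name continues to come from the file name alone, regardless of which subdirectory the file sits in.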