# Compare: 3.31.12...feat/manag

688 commits between tag `3.31.12` and branch `feat/manag`.
#### .env.example (new file, 1 line)

```text
PYPI_TOKEN=your-pypi-token
```

#### .github/workflows/ci.yml (new file, 70 lines)

```yaml
name: CI

on:
  push:
    branches: [ main, feat/*, fix/* ]
  pull_request:
    branches: [ main ]

jobs:
  validate-openapi:
    name: Validate OpenAPI Specification
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Check if OpenAPI changed
        id: openapi-changed
        uses: tj-actions/changed-files@v44
        with:
          files: openapi.yaml

      - name: Setup Node.js
        if: steps.openapi-changed.outputs.any_changed == 'true'
        uses: actions/setup-node@v4
        with:
          node-version: '18'

      - name: Install Redoc CLI
        if: steps.openapi-changed.outputs.any_changed == 'true'
        run: |
          npm install -g @redocly/cli

      - name: Validate OpenAPI specification
        if: steps.openapi-changed.outputs.any_changed == 'true'
        run: |
          redocly lint openapi.yaml

  code-quality:
    name: Code Quality Checks
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0  # Fetch all history for proper diff

      - name: Get changed Python files
        id: changed-py-files
        uses: tj-actions/changed-files@v44
        with:
          files: |
            **/*.py
          files_ignore: |
            comfyui_manager/legacy/**

      - name: Setup Python
        if: steps.changed-py-files.outputs.any_changed == 'true'
        uses: actions/setup-python@v5
        with:
          python-version: '3.9'

      - name: Install dependencies
        if: steps.changed-py-files.outputs.any_changed == 'true'
        run: |
          pip install ruff

      - name: Run ruff linting on changed files
        if: steps.changed-py-files.outputs.any_changed == 'true'
        run: |
          echo "Changed files: ${{ steps.changed-py-files.outputs.all_changed_files }}"
          echo "${{ steps.changed-py-files.outputs.all_changed_files }}" | xargs -r ruff check
```

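For a local preview of what the `code-quality` job will lint, a small hypothetical helper (not part of the repo) can reproduce the changed-files filter, including the `comfyui_manager/legacy/**` exclusion:

```python
# Hypothetical local helper mirroring the CI job: run ruff only on .py files
# that differ from main, skipping the legacy tree like `files_ignore` does.
import subprocess

def changed_python_files(base: str = "main") -> list[str]:
    out = subprocess.run(
        ["git", "diff", "--name-only", base, "--", "*.py"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()
    return [f for f in out if not f.startswith("comfyui_manager/legacy/")]

if __name__ == "__main__":
    files = changed_python_files()
    if files:
        subprocess.run(["ruff", "check", *files], check=True)
```
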
#### .github/workflows/publish-to-pypi.yml (new file, 58 lines)

```yaml
name: Publish to PyPI

on:
  workflow_dispatch:
  push:
    branches:
      - manager-v4
    paths:
      - "pyproject.toml"

jobs:
  build-and-publish:
    runs-on: ubuntu-latest
    if: ${{ github.repository_owner == 'ltdrdata' || github.repository_owner == 'Comfy-Org' }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.x'

      - name: Install build dependencies
        run: |
          python -m pip install --upgrade pip
          python -m pip install build twine

      - name: Get current version
        id: current_version
        run: |
          CURRENT_VERSION=$(grep -oP '^version = "\K[^"]+' pyproject.toml)
          echo "version=$CURRENT_VERSION" >> $GITHUB_OUTPUT
          echo "Current version: $CURRENT_VERSION"

      - name: Build package
        run: python -m build

      # - name: Create GitHub Release
      #   id: create_release
      #   uses: softprops/action-gh-release@v2
      #   env:
      #     GITHUB_TOKEN: ${{ github.token }}
      #   with:
      #     files: dist/*
      #     tag_name: v${{ steps.current_version.outputs.version }}
      #     draft: false
      #     prerelease: false
      #     generate_release_notes: true

      - name: Publish to PyPI
        uses: pypa/gh-action-pypi-publish@76f52bc884231f62b9a034ebfe128415bbaabdfc
        with:
          password: ${{ secrets.PYPI_TOKEN }}
          skip-existing: true
          verbose: true
```

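The `grep -oP` step captures the `version = "..."` line from `pyproject.toml`. A rough Python equivalent, useful for checking locally what the workflow will publish (illustrative only):

```python
# Illustrative local check of what the "Get current version" step extracts.
import pathlib
import re

text = pathlib.Path("pyproject.toml").read_text(encoding="utf-8")
match = re.search(r'^version = "([^"]+)"', text, flags=re.MULTILINE)
print(match.group(1) if match else "version not found")
```
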
#### .github/workflows/publish.yml (deleted, 25 lines)

```yaml
name: Publish to Comfy registry
on:
  workflow_dispatch:
  push:
    branches:
      - main-blocked
    paths:
      - "pyproject.toml"

permissions:
  issues: write

jobs:
  publish-node:
    name: Publish Custom Node to registry
    runs-on: ubuntu-latest
    if: ${{ github.repository_owner == 'ltdrdata' }}
    steps:
      - name: Check out code
        uses: actions/checkout@v4
      - name: Publish Custom Node
        uses: Comfy-Org/publish-node-action@v1
        with:
          ## Add your own personal access token to your Github Repository secrets and reference it here.
          personal_access_token: ${{ secrets.REGISTRY_ACCESS_TOKEN }}
```

#### .gitignore (modified, +4 lines)

```diff
@@ -17,4 +17,8 @@ github-stats-cache.json
 pip_overrides.json
 *.json
 check2.sh
 /venv/
+build
+dist
+*.egg-info
+.env
```

#### CONTRIBUTING.md (new file, 47 lines)

````markdown
## Testing Changes

1. Activate the ComfyUI environment.

2. Build the package locally after making changes.

   ```bash
   # from inside the ComfyUI-Manager directory, with the ComfyUI environment activated
   python -m build
   ```

3. Install the package locally in the ComfyUI environment.

   ```bash
   # Uninstall the existing package
   pip uninstall comfyui-manager

   # Install the local package
   pip install dist/comfyui-manager-*.whl
   ```

4. Start ComfyUI.

   ```bash
   # after navigating to the ComfyUI directory
   python main.py
   ```

## Manually Publish a Test Version to PyPI

1. Set the `PYPI_TOKEN` environment variable in the `.env` file.

2. If manually publishing, you likely want to use a release candidate version, so set the version in [pyproject.toml](pyproject.toml) to something like `0.0.1rc1`.

3. Build the package.

   ```bash
   python -m build
   ```

4. Upload the package to PyPI.

   ```bash
   python -m twine upload dist/* --username __token__ --password $PYPI_TOKEN
   ```

5. View at https://pypi.org/project/comfyui-manager/
````

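Steps 1 and 4 can be glued together in one script. A sketch, assuming `python-dotenv` as a dev dependency (nothing in this diff pins it):

```python
# Sketch: load PYPI_TOKEN from .env (see .env.example), then upload with twine.
import glob
import os
import subprocess

from dotenv import load_dotenv  # assumed dev dependency: pip install python-dotenv

load_dotenv()
subprocess.run(
    ["python", "-m", "twine", "upload", *glob.glob("dist/*"),
     "--username", "__token__", "--password", os.environ["PYPI_TOKEN"]],
    check=True,
)
```
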
#### MANIFEST.in (new file, 15 lines)

```text
include comfyui_manager/js/*
include comfyui_manager/*.json
include comfyui_manager/glob/*
include LICENSE.txt
include README.md
include requirements.txt
include pyproject.toml
include custom-node-list.json
include extension-node-list.json
include extras.json
include github-stats.json
include model-list.json
include alter-list.json
include comfyui_manager/channels.list.template
include comfyui_manager/pip-policy.json
```

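A quick way to confirm these entries actually land in the built sdist (illustrative; the archive-name glob is an assumption about how the build backend normalizes the project name):

```python
# After `python -m build`: check that MANIFEST.in entries are in the sdist.
import glob
import tarfile

sdist = sorted(glob.glob("dist/comfyui*manager-*.tar.gz"))[-1]
with tarfile.open(sdist) as tf:
    names = tf.getnames()

for expected in ("comfyui_manager/pip-policy.json",
                 "comfyui_manager/channels.list.template"):
    assert any(n.endswith(expected) for n in names), f"missing from sdist: {expected}"
print(f"{sdist}: {len(names)} files, manifest entries present")
```
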
#### README.md (modified, 138 changed lines; old and new lines appear in hunk order)

````markdown
@@ -5,86 +5,35 @@
![menu](misc/menu.jpg)

## NOTICE
* V4.0: Modify the structure to be installable via pip instead of using git clone.
* V3.16: Support for `uv` has been added. Set `use_uv` in `config.ini`.
* V3.10: `double-click feature` is removed
  * This feature has been moved to https://github.com/ltdrdata/comfyui-connection-helper
* V3.3.2: Overhauled. Officially supports [https://comfyregistry.org/](https://comfyregistry.org/).
* V3.3.2: Overhauled. Officially supports [https://registry.comfy.org/](https://registry.comfy.org/).
* You can see whole nodes info on [ComfyUI Nodes Info](https://ltdrdata.github.io/) page.

## Installation

### Installation[method1] (General installation method: ComfyUI-Manager only)
* When installing the latest ComfyUI, it will be automatically installed as a dependency, so manual installation is no longer necessary.

To install ComfyUI-Manager in addition to an existing installation of ComfyUI, you can follow the following steps:
* Manual installation of the nightly version:
  * Clone to a temporary directory (**Note:** Do **not** clone into `ComfyUI/custom_nodes`.)
    ```
    git clone https://github.com/Comfy-Org/ComfyUI-Manager
    ```
  * Install via pip
    ```
    cd ComfyUI-Manager
    pip install .
    ```

1. goto `ComfyUI/custom_nodes` dir in terminal(cmd)
2. `git clone https://github.com/ltdrdata/ComfyUI-Manager comfyui-manager`
3. Restart ComfyUI


### Installation[method2] (Installation for portable ComfyUI version: ComfyUI-Manager only)
1. install git
   - https://git-scm.com/download/win
   - standalone version
   - select option: use windows default console window
2. Download [scripts/install-manager-for-portable-version.bat](https://github.com/ltdrdata/ComfyUI-Manager/raw/main/scripts/install-manager-for-portable-version.bat) into installed `"ComfyUI_windows_portable"` directory
   - Don't click. Right click the link and use save as...
3. double click `install-manager-for-portable-version.bat` batch file

![portable-install](misc/portable-install.jpg)


### Installation[method3] (Installation through comfy-cli: install ComfyUI and ComfyUI-Manager at once.)
> RECOMMENDED: comfy-cli provides various features to manage ComfyUI from the CLI.

* **prerequisite: python 3, git**

Windows:
```commandline
python -m venv venv
venv\Scripts\activate
pip install comfy-cli
comfy install
```

Linux/OSX:
```commandline
python -m venv venv
. venv/bin/activate
pip install comfy-cli
comfy install
```
* See also: https://github.com/Comfy-Org/comfy-cli


### Installation[method4] (Installation for linux+venv: ComfyUI + ComfyUI-Manager)
## Front-end

To install ComfyUI with ComfyUI-Manager on Linux using a venv environment, you can follow these steps:
* **prerequisite: python-is-python3, python3-venv, git**

1. Download [scripts/install-comfyui-venv-linux.sh](https://github.com/ltdrdata/ComfyUI-Manager/raw/main/scripts/install-comfyui-venv-linux.sh) into empty install directory
   - Don't click. Right click the link and use save as...
   - ComfyUI will be installed in the subdirectory of the specified directory, and the directory will contain the generated executable script.
2. `chmod +x install-comfyui-venv-linux.sh`
3. `./install-comfyui-venv-linux.sh`

### Installation Precautions
* **DO**: `ComfyUI-Manager` files must be accurately located in the path `ComfyUI/custom_nodes/comfyui-manager`
  * Installing in a compressed file format is not recommended.
* **DON'T**: Decompress directly into the `ComfyUI/custom_nodes` location, resulting in the Manager contents like `__init__.py` being placed directly in that directory.
  * You have to remove all ComfyUI-Manager files from `ComfyUI/custom_nodes`
* **DON'T**: In a form where decompression occurs in a path such as `ComfyUI/custom_nodes/ComfyUI-Manager/ComfyUI-Manager`.
* **DON'T**: In a form where decompression occurs in a path such as `ComfyUI/custom_nodes/ComfyUI-Manager-main`.
  * In such cases, `ComfyUI-Manager` may operate, but it won't be recognized within `ComfyUI-Manager`, and updates cannot be performed. It also poses the risk of duplicate installations. Remove it and install properly via `git clone` method.


You can execute ComfyUI by running either `./run_gpu.sh` or `./run_cpu.sh` depending on your system configuration.

## Colab Notebook
This repository provides Colab notebooks that allow you to install and use ComfyUI, including ComfyUI-Manager. To use ComfyUI, [click on this link](https://colab.research.google.com/github/ltdrdata/ComfyUI-Manager/blob/main/notebooks/comfyui_colab_with_manager.ipynb).
* Support for installing ComfyUI
* Support for basic installation of ComfyUI-Manager
* Support for automatically installing dependencies of custom nodes upon restarting Colab notebooks.
* The built-in front-end of ComfyUI-Manager is the legacy front-end. The front-end for ComfyUI-Manager is now provided via [ComfyUI Frontend](https://github.com/Comfy-Org/ComfyUI_frontend).
  * To enable the legacy front-end, set the environment variable `ENABLE_LEGACY_COMFYUI_MANAGER_FRONT` to `true` before running.


## How To Use

@@ -266,13 +215,14 @@ The following settings are applied based on the section marked as `is_default`.
downgrade_blacklist = <Set a list of packages to prevent downgrades. List them separated by commas.>
security_level = <Set the security level => strong|normal|normal-|weak>
always_lazy_install = <Whether to perform dependency installation on restart even in environments other than Windows.>
network_mode = <Set the network mode => public|private|offline>
network_mode = <Set the network mode => public|private|offline|personal_cloud>
```

* network_mode:
  - public: An environment that uses a typical public network.
  - private: An environment that uses a closed network, where a private node DB is configured via `channel_url`. (Uses cache if available)
  - offline: An environment that does not use any external connections when using an offline network. (Uses cache if available)
  - personal_cloud: Applies relaxed security features in cloud environments such as Google Colab or Runpod, where strong security is not required.


## Additional Feature

@@ -363,31 +313,33 @@ When you run the `scan.sh` script:

## Security policy
* Edit `config.ini` file: add `security_level = <LEVEL>`
  * `strong`
    * doesn't allow `high` and `middle` level risky feature
  * `normal`
    * doesn't allow `high` level risky feature
    * `middle` level risky feature is available
  * `normal-`
    * doesn't allow `high` level risky feature if `--listen` is specified and not starts with `127.`
    * `middle` level risky feature is available
  * `weak`
    * all feature is available

* `high` level risky features
  * `Install via git url`, `pip install`
  * Installation of custom nodes registered not in the `default channel`.
  * Fix custom nodes

* `middle` level risky features
  * Uninstall/Update
  * Installation of custom nodes registered in the `default channel`.
  * Restore/Remove Snapshot
  * Restart

* `low` level risky features
  * Update ComfyUI

The security settings are applied based on whether the ComfyUI server's listener is non-local and whether the network mode is set to `personal_cloud`.

* **non-local**: When the server is launched with `--listen` and is bound to a network range other than the local `127.` range, allowing remote IP access.
* **personal\_cloud**: When the `network_mode` is set to `personal_cloud`.


### Risky Level Table

| Risky Level | features |
|-------------|----------|
| high+       | * `Install via git url`, `pip install`<BR>* Installation of nodepack registered not in the `default channel`. |
| high        | * Fix nodepack |
| middle+     | * Uninstall/Update<BR>* Installation of nodepack registered in the `default channel`.<BR>* Restore/Remove Snapshot<BR>* Install model |
| middle      | * Restart |
| low         | * Update ComfyUI |


### Security Level Table

| Security Level | local | non-local (personal_cloud) | non-local (not personal_cloud) |
|----------------|-------|----------------------------|--------------------------------|
| strong  | * Only `weak` level risky features are allowed | * Only `weak` level risky features are allowed | * Only `weak` level risky features are allowed |
| normal  | * `high+` and `high` level risky features are not allowed<BR>* `middle+` and `middle` level risky features are available | * `high+` and `high` level risky features are not allowed<BR>* `middle+` and `middle` level risky features are available | * `high+`, `high` and `middle+` level risky features are not allowed<BR>* `middle` level risky features are available |
| normal- | * All features are available | * `high+` and `high` level risky features are not allowed<BR>* `middle+` and `middle` level risky features are available | * `high+`, `high` and `middle+` level risky features are not allowed<BR>* `middle` level risky features are available |
| weak    | * All features are available | * All features are available | * `high+` and `middle+` level risky features are not allowed<BR>* `high`, `middle` and `low` level risky features are available |


# Disclaimer
````

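The `security_level` and `network_mode` options documented above live in `config.ini`. A minimal reading sketch with `configparser` (the `strict=False` pattern mirrors `read_downgrade_blacklist()` in the cm-cli hunks further down, but this snippet itself is not repo code):

```python
# Illustrative: read the documented options from config.ini.
import configparser

config = configparser.ConfigParser(strict=False)
config.read("config.ini")
default_conf = config["default"]  # section marked as is_default

security_level = default_conf.get("security_level", "normal")
network_mode = default_conf.get("network_mode", "public")
print(f"security_level={security_level}, network_mode={network_mode}")
```
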
#### __init__.py (deleted, 21 lines)

```python
import os
import sys

cli_mode_flag = os.path.join(os.path.dirname(__file__), '.enable-cli-only-mode')

if not os.path.exists(cli_mode_flag):
    sys.path.append(os.path.join(os.path.dirname(__file__), "glob"))
    import manager_server  # noqa: F401
    import share_3rdparty  # noqa: F401
    import cm_global

    if not cm_global.disable_front and not 'DISABLE_COMFYUI_MANAGER_FRONT' in os.environ:
        WEB_DIRECTORY = "js"
else:
    print("\n[ComfyUI-Manager] !! cli-only-mode is enabled !!\n")

NODE_CLASS_MAPPINGS = {}
__all__ = ['NODE_CLASS_MAPPINGS']
```

#### Deleted file (6 lines): the old `ltdrdata` channel list (file name not preserved in the extract)

```text
default::https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main
recent::https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/node_db/new
legacy::https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/node_db/legacy
forked::https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/node_db/forked
dev::https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/node_db/dev
tutorial::https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/node_db/tutorial
```

#### comfyui_manager/README.md (new file, 49 lines)

```markdown
# ComfyUI-Manager: Core Backend (glob)

This directory contains the Python backend modules that power ComfyUI-Manager, handling the core functionality of node management, downloading, security, and server operations.

## Directory Structure
- **glob/** - code for new cacheless ComfyUI-Manager
- **legacy/** - code for legacy ComfyUI-Manager

## Core Modules
- **manager_core.py**: The central implementation of management functions, handling configuration, installation, updates, and node management.
- **manager_server.py**: Implements server functionality and API endpoints for the web interface to interact with the backend.

## Specialized Modules

- **share_3rdparty.py**: Manages integration with third-party sharing platforms.

## Architecture

The backend follows a modular design pattern with clear separation of concerns:

1. **Core Layer**: Manager modules provide the primary API and business logic
2. **Utility Layer**: Helper modules provide specialized functionality
3. **Integration Layer**: Modules that connect to external systems

## Security Model

The system implements a comprehensive security framework with multiple levels:

- **Block**: Highest security - blocks most remote operations
- **High**: Allows only specific trusted operations
- **Middle**: Standard security for most users
- **Normal-**: More permissive for advanced users
- **Weak**: Lowest security for development environments

## Implementation Details

- The backend is designed to work seamlessly with ComfyUI
- Asynchronous task queuing is implemented for background operations
- The system supports multiple installation modes
- Error handling and risk assessment are integrated throughout the codebase

## API Integration

The backend exposes a REST API via `manager_server.py` that enables:
- Custom node management (install, update, disable, remove)
- Model downloading and organization
- System configuration
- Snapshot management
- Workflow component handling
```

#### comfyui_manager/__init__.py (new file, 104 lines; indentation reconstructed)

```python
import os
import logging
from aiohttp import web
from .common.manager_security import HANDLER_POLICY
from .common import manager_security
from comfy.cli_args import args


def prestartup():
    from . import prestartup_script  # noqa: F401
    logging.info('[PRE] ComfyUI-Manager')


def start():
    logging.info('[START] ComfyUI-Manager')
    from .common import cm_global  # noqa: F401

    if args.enable_manager:
        if args.enable_manager_legacy_ui:
            try:
                from .legacy import manager_server  # noqa: F401
                from .legacy import share_3rdparty  # noqa: F401
                from .legacy import manager_core as core
                import nodes

                logging.info("[ComfyUI-Manager] Legacy UI is enabled.")
                nodes.EXTENSION_WEB_DIRS['comfyui-manager-legacy'] = os.path.join(os.path.dirname(__file__), 'js')
            except Exception as e:
                print("Error enabling legacy ComfyUI Manager frontend:", e)
                core = None
        else:
            from .glob import manager_server  # noqa: F401
            from .glob import share_3rdparty  # noqa: F401
            from .glob import manager_core as core

        if core is not None:
            manager_security.is_personal_cloud_mode = core.get_config()['network_mode'].lower() == 'personal_cloud'


def should_be_disabled(fullpath: str) -> bool:
    """
    1. Disables the legacy ComfyUI-Manager.
    2. The blocklist can be expanded later based on policies.
    """
    if args.enable_manager:
        # In cases where installation is done via a zip archive, the directory name may not be
        # comfyui-manager, and it may not contain a git repository.
        # It is assumed that any installed legacy ComfyUI-Manager will have at least
        # 'comfyui-manager' in its directory name.
        dir_name = os.path.basename(fullpath).lower()
        if 'comfyui-manager' in dir_name:
            return True

    return False


def get_client_ip(request):
    peername = request.transport.get_extra_info("peername")
    if peername is not None:
        host, port = peername
        return host

    return "unknown"


def create_middleware():
    connected_clients = set()
    is_local_mode = manager_security.is_loopback(args.listen)

    @web.middleware
    async def manager_middleware(request: web.Request, handler):
        nonlocal connected_clients

        # security policy for remote environments
        prev_client_count = len(connected_clients)
        client_ip = get_client_ip(request)
        connected_clients.add(client_ip)
        next_client_count = len(connected_clients)

        if prev_client_count == 1 and next_client_count > 1:
            manager_security.multiple_remote_alert()

        policy = manager_security.get_handler_policy(handler)
        is_banned = False

        # policy check
        if len(connected_clients) > 1:
            if is_local_mode:
                if HANDLER_POLICY.MULTIPLE_REMOTE_BAN_NON_LOCAL in policy:
                    is_banned = True
            if HANDLER_POLICY.MULTIPLE_REMOTE_BAN_NOT_PERSONAL_CLOUD in policy:
                is_banned = not manager_security.is_personal_cloud_mode

        if HANDLER_POLICY.BANNED in policy:
            is_banned = True

        if is_banned:
            logging.warning(f"[Manager] Banning request from {client_ip}: {request.path}")
            response = web.Response(text="[Manager] This request is banned.", status=403)
        else:
            response: web.Response = await handler(request)

        return response

    return manager_middleware
```

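`create_middleware()` returns a standard aiohttp middleware. A minimal wiring sketch (the real attachment happens inside ComfyUI's server startup; the standalone app below is purely for illustration):

```python
# Sketch: attach the Manager policy middleware to a bare aiohttp app.
from aiohttp import web

from comfyui_manager import create_middleware

app = web.Application(middlewares=[create_middleware()])
# web.run_app(app, host="127.0.0.1", port=8188)  # uncomment to serve
```
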
#### comfyui_manager/channels.list.template (new file, 6 lines)

```text
default::https://raw.githubusercontent.com/Comfy-Org/ComfyUI-Manager/main
recent::https://raw.githubusercontent.com/Comfy-Org/ComfyUI-Manager/main/node_db/new
legacy::https://raw.githubusercontent.com/Comfy-Org/ComfyUI-Manager/main/node_db/legacy
forked::https://raw.githubusercontent.com/Comfy-Org/ComfyUI-Manager/main/node_db/forked
dev::https://raw.githubusercontent.com/Comfy-Org/ComfyUI-Manager/main/node_db/dev
tutorial::https://raw.githubusercontent.com/Comfy-Org/ComfyUI-Manager/main/node_db/tutorial
```

#### cm-cli.py (modified; file header not preserved in the extract, so change markers are omitted and old/new lines appear in hunk order)

```diff
@@ -15,37 +15,38 @@ import git
import importlib


sys.path.append(os.path.dirname(__file__))
sys.path.append(os.path.join(os.path.dirname(__file__), "glob"))

import manager_util
from ..common import manager_util

# read env vars
# COMFYUI_FOLDERS_BASE_PATH is not required in cm-cli.py
# `comfy_path` should be resolved before importing manager_core
comfy_path = os.environ.get('COMFYUI_PATH')
if comfy_path is None:
    try:
        import folder_paths
        comfy_path = os.path.join(os.path.dirname(folder_paths.__file__))
    except:
        print("\n[bold yellow]WARN: The `COMFYUI_PATH` environment variable is not set. Assuming `custom_nodes/ComfyUI-Manager/../../` as the ComfyUI path.[/bold yellow]", file=sys.stderr)
        comfy_path = os.path.abspath(os.path.join(manager_util.comfyui_manager_path, '..', '..'))

# This should be placed here
comfy_path = os.environ.get('COMFYUI_PATH')

if comfy_path is None:
    print("[bold red]cm-cli: environment variable 'COMFYUI_PATH' is not specified.[/bold red]")
    exit(-1)

sys.path.append(comfy_path)

if not os.path.exists(os.path.join(comfy_path, 'folder_paths.py')):
    print("[bold red]cm-cli: '{comfy_path}' is not a valid 'COMFYUI_PATH' location.[/bold red]")
    exit(-1)


import utils.extra_config
import cm_global
import manager_core as core
from manager_core import unified_manager
import cnr_utils
from ..common import cm_global
from ..legacy import manager_core as core
from ..common import context
from ..legacy.manager_core import unified_manager
from ..common import cnr_utils

comfyui_manager_path = os.path.abspath(os.path.dirname(__file__))

cm_global.pip_blacklist = {'torch', 'torchaudio', 'torchsde', 'torchvision'}
cm_global.pip_downgrade_blacklist = ['torch', 'torchaudio', 'torchsde', 'torchvision', 'transformers', 'safetensors', 'kornia']
cm_global.pip_overrides = {'numpy': 'numpy<2'}

cm_global.pip_overrides = {}

if os.path.exists(os.path.join(manager_util.comfyui_manager_path, "pip_overrides.json")):
    with open(os.path.join(manager_util.comfyui_manager_path, "pip_overrides.json"), 'r', encoding="UTF-8", errors="ignore") as json_file:

@@ -65,7 +66,7 @@ def check_comfyui_hash():
        repo = git.Repo(comfy_path)
        core.comfy_ui_revision = len(list(repo.iter_commits('HEAD')))
        core.comfy_ui_commit_datetime = repo.head.commit.committed_datetime
    except:
    except Exception:
        print('[bold yellow]INFO: Frozen ComfyUI mode.[/bold yellow]')
        core.comfy_ui_revision = 0
        core.comfy_ui_commit_datetime = 0

@@ -81,7 +82,7 @@ def read_downgrade_blacklist():
    try:
        import configparser
        config = configparser.ConfigParser(strict=False)
        config.read(core.manager_config.path)
        config.read(context.manager_config_path)
        default_conf = config['default']

        if 'downgrade_blacklist' in default_conf:

@@ -89,7 +90,7 @@ def read_downgrade_blacklist():
            items = [x.strip() for x in items if x != '']
            cm_global.pip_downgrade_blacklist += items
            cm_global.pip_downgrade_blacklist = list(set(cm_global.pip_downgrade_blacklist))
    except:
    except Exception:
        pass

@@ -104,7 +105,7 @@ class Ctx:
        self.no_deps = False
        self.mode = 'cache'
        self.user_directory = None
        self.custom_nodes_paths = [os.path.join(core.comfy_base_path, 'custom_nodes')]
        self.custom_nodes_paths = [os.path.join(context.comfy_base_path, 'custom_nodes')]
        self.manager_files_directory = os.path.dirname(__file__)

        if Ctx.folder_paths is None:

@@ -142,15 +143,14 @@ class Ctx:
        if os.path.exists(extra_model_paths_yaml):
            utils.extra_config.load_extra_path_config(extra_model_paths_yaml)

        core.update_user_directory(user_directory)
        context.update_user_directory(user_directory)

        if os.path.exists(core.manager_pip_overrides_path):
            with open(core.manager_pip_overrides_path, 'r', encoding="UTF-8", errors="ignore") as json_file:
        if os.path.exists(context.manager_pip_overrides_path):
            with open(context.manager_pip_overrides_path, 'r', encoding="UTF-8", errors="ignore") as json_file:
                cm_global.pip_overrides = json.load(json_file)
                cm_global.pip_overrides = {'numpy': 'numpy<2'}

        if os.path.exists(core.manager_pip_blacklist_path):
            with open(core.manager_pip_blacklist_path, 'r', encoding="UTF-8", errors="ignore") as f:
        if os.path.exists(context.manager_pip_blacklist_path):
            with open(context.manager_pip_blacklist_path, 'r', encoding="UTF-8", errors="ignore") as f:
                for x in f.readlines():
                    y = x.strip()
                    if y != '':

@@ -163,15 +163,15 @@ class Ctx:
    @staticmethod
    def get_startup_scripts_path():
        return os.path.join(core.manager_startup_script_path, "install-scripts.txt")
        return os.path.join(context.manager_startup_script_path, "install-scripts.txt")

    @staticmethod
    def get_restore_snapshot_path():
        return os.path.join(core.manager_startup_script_path, "restore-snapshot.json")
        return os.path.join(context.manager_startup_script_path, "restore-snapshot.json")

    @staticmethod
    def get_snapshot_path():
        return core.manager_snapshot_path
        return context.manager_snapshot_path

    @staticmethod
    def get_custom_nodes_paths():

@@ -184,13 +184,18 @@ class Ctx:
cmd_ctx = Ctx()


def install_node(node_spec_str, is_all=False, cnt_msg=''):
def install_node(node_spec_str, is_all=False, cnt_msg='', **kwargs):
    exit_on_fail = kwargs.get('exit_on_fail', False)
    print(f"install_node exit on fail:{exit_on_fail}...")

    if core.is_valid_url(node_spec_str):
        # install via urls
        res = asyncio.run(core.gitclone_install(node_spec_str, no_deps=cmd_ctx.no_deps))
        if not res.result:
            print(res.msg)
            print(f"[bold red]ERROR: An error occurred while installing '{node_spec_str}'.[/bold red]")
            if exit_on_fail:
                sys.exit(1)
        else:
            print(f"{cnt_msg} [INSTALLED] {node_spec_str:50}")
    else:

@@ -225,6 +230,8 @@ def install_node(node_spec_str, is_all=False, cnt_msg=''):
            print("")
        else:
            print(f"[bold red]ERROR: An error occurred while installing '{node_name}'.\n{res.msg}[/bold red]")
            if exit_on_fail:
                sys.exit(1)


def reinstall_node(node_spec_str, is_all=False, cnt_msg=''):

@@ -431,8 +438,11 @@ def show_list(kind, simple=False):
    flag = kind in ['all', 'cnr', 'installed', 'enabled']
    for k, v in unified_manager.active_nodes.items():
        if flag:
            cnr = unified_manager.cnr_map[k]
            processed[k] = "[ ENABLED ] ", cnr['name'], k, cnr['publisher']['name'], v[0]
            cnr = unified_manager.cnr_map.get(k)
            if cnr:
                processed[k] = "[ ENABLED ] ", cnr['name'], k, cnr['publisher']['name'], v[0]
            else:
                processed[k] = None
        else:
            processed[k] = None

@@ -452,8 +462,11 @@ def show_list(kind, simple=False):
            continue

        if flag:
            cnr = unified_manager.cnr_map[k]
            processed[k] = "[ DISABLED ] ", cnr['name'], k, cnr['publisher']['name'], ", ".join(list(v.keys()))
            cnr = unified_manager.cnr_map.get(k)  # NOTE: can this be None if removed from CNR after installed
            if cnr:
                processed[k] = "[ DISABLED ] ", cnr['name'], k, cnr['publisher']['name'], ", ".join(list(v.keys()))
            else:
                processed[k] = None
        else:
            processed[k] = None

@@ -462,8 +475,11 @@ def show_list(kind, simple=False):
            continue

        if flag:
            cnr = unified_manager.cnr_map[k]
            processed[k] = "[ DISABLED ] ", cnr['name'], k, cnr['publisher']['name'], 'nightly'
            cnr = unified_manager.cnr_map.get(k)
            if cnr:
                processed[k] = "[ DISABLED ] ", cnr['name'], k, cnr['publisher']['name'], 'nightly'
            else:
                processed[k] = None
        else:
            processed[k] = None

@@ -483,9 +499,12 @@ def show_list(kind, simple=False):
            continue

        if flag:
            cnr = unified_manager.cnr_map[k]
            ver_spec = v['latest_version']['version'] if 'latest_version' in v else '0.0.0'
            processed[k] = "[ NOT INSTALLED ] ", cnr['name'], k, cnr['publisher']['name'], ver_spec
            cnr = unified_manager.cnr_map.get(k)
            if cnr:
                ver_spec = v['latest_version']['version'] if 'latest_version' in v else '0.0.0'
                processed[k] = "[ NOT INSTALLED ] ", cnr['name'], k, cnr['publisher']['name'], ver_spec
            else:
                processed[k] = None
        else:
            processed[k] = None

@@ -586,7 +605,7 @@ def get_all_installed_node_specs():
    return res


def for_each_nodes(nodes, act, allow_all=True):
def for_each_nodes(nodes, act, allow_all=True, **kwargs):
    is_all = False
    if allow_all and 'all' in nodes:
        is_all = True

@@ -598,7 +617,7 @@ def for_each_nodes(nodes, act, allow_all=True):
    i = 1
    for x in nodes:
        try:
            act(x, is_all=is_all, cnt_msg=f'{i}/{total}')
            act(x, is_all=is_all, cnt_msg=f'{i}/{total}', **kwargs)
        except Exception as e:
            print(f"ERROR: {e}")
            traceback.print_exc()

@@ -642,13 +661,17 @@ def install(
        None,
        help="user directory"
    ),
    exit_on_fail: bool = typer.Option(
        False,
        help="Exit on failure"
    )
):
    cmd_ctx.set_user_directory(user_directory)
    cmd_ctx.set_channel_mode(channel, mode)
    cmd_ctx.set_no_deps(no_deps)

    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, core.manager_files_path)
    for_each_nodes(nodes, act=install_node)
    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, context.manager_files_path)
    for_each_nodes(nodes, act=install_node, exit_on_fail=exit_on_fail)
    pip_fixer.fix_broken()

@@ -685,7 +708,7 @@ def reinstall(
    cmd_ctx.set_channel_mode(channel, mode)
    cmd_ctx.set_no_deps(no_deps)

    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, core.manager_files_path)
    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, context.manager_files_path)
    for_each_nodes(nodes, act=reinstall_node)
    pip_fixer.fix_broken()

@@ -739,7 +762,7 @@ def update(
    if 'all' in nodes:
        asyncio.run(auto_save_snapshot())

    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, core.manager_files_path)
    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, context.manager_files_path)

    for x in nodes:
        if x.lower() in ['comfyui', 'comfy', 'all']:

@@ -840,7 +863,7 @@ def fix(
    if 'all' in nodes:
        asyncio.run(auto_save_snapshot())

    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, core.manager_files_path)
    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, context.manager_files_path)
    for_each_nodes(nodes, fix_node, allow_all=True)
    pip_fixer.fix_broken()

@@ -1117,7 +1140,7 @@ def restore_snapshot(
        print(f"[bold red]ERROR: `{snapshot_path}` is not exists.[/bold red]")
        exit(1)

    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, core.manager_files_path)
    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, context.manager_files_path)
    try:
        asyncio.run(core.restore_snapshot(snapshot_path, extras))
    except Exception:

@@ -1149,7 +1172,7 @@ def restore_dependencies(
    total = len(node_paths)
    i = 1

    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, core.manager_files_path)
    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, context.manager_files_path)
    for x in node_paths:
        print("----------------------------------------------------------------------------------------------------")
        print(f"Restoring [{i}/{total}]: {x}")

@@ -1168,7 +1191,7 @@ def post_install(
):
    path = os.path.expanduser(path)

    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, core.manager_files_path)
    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, context.manager_files_path)
    unified_manager.execute_install_script('', path, instant_execution=True)
    pip_fixer.fix_broken()

@@ -1208,11 +1231,11 @@ def install_deps(
    with open(deps, 'r', encoding="UTF-8", errors="ignore") as json_file:
        try:
            json_obj = json.load(json_file)
        except:
        except Exception:
            print(f"[bold red]Invalid json file: {deps}[/bold red]")
            exit(1)

    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, core.manager_files_path)
    pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, context.manager_files_path)
    for k in json_obj['custom_nodes'].keys():
        state = core.simple_check_custom_node(k)
        if state == 'installed':

@@ -1269,6 +1292,10 @@ def export_custom_node_ids(
            print(f"{x['id']}@unknown", file=output_file)


def main():
    app()


if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(app())
```

#### comfyui_manager/common/README.md (new file, 16 lines)

```markdown
# ComfyUI-Manager: Core Backend (glob)

This directory contains the Python backend modules that power ComfyUI-Manager, handling the core functionality of node management, downloading, security, and server operations.

## Core Modules

- **manager_downloader.py**: Handles downloading operations for models, extensions, and other resources.
- **manager_util.py**: Provides utility functions used throughout the system.

## Specialized Modules

- **cm_global.py**: Maintains global variables and state management across the system.
- **cnr_utils.py**: Helper utilities for interacting with the custom node registry (CNR).
- **git_utils.py**: Git-specific utilities for repository operations.
- **node_package.py**: Handles the packaging and installation of node extensions.
- **security_check.py**: Implements the multi-level security system for installation safety.
```

#### comfyui_manager/common/__init__.py (new empty file)

#### comfyui_manager/common/cnr_utils.py (modified; file header not preserved in the extract, so change markers are omitted and old/new lines appear in hunk order)

```diff
@@ -6,8 +6,9 @@ import time
from dataclasses import dataclass
from typing import List

import manager_core
import manager_util
from . import context
from . import manager_util

import requests
import toml

@@ -47,9 +48,9 @@ async def _get_cnr_data(cache_mode=True, dont_wait=True):
    # Get ComfyUI version tag
    if is_desktop:
        # extract version from pyproject.toml instead of git tag
        comfyui_ver = manager_core.get_current_comfyui_ver() or 'unknown'
        comfyui_ver = context.get_current_comfyui_ver() or 'unknown'
    else:
        comfyui_ver = manager_core.get_comfyui_tag() or 'unknown'
        comfyui_ver = context.get_comfyui_tag() or 'unknown'

    if is_desktop:
        if is_windows:

@@ -111,7 +112,7 @@ async def _get_cnr_data(cache_mode=True, dont_wait=True):
            json_obj = await fetch_all()
            manager_util.save_to_cache(uri, json_obj)
            return json_obj['nodes']
        except:
        except Exception:
            res = {}
            print("Cannot connect to comfyregistry.")
        finally:

@@ -179,7 +180,7 @@ def install_node(node_id, version=None):
    else:
        url = f"{base_url}/nodes/{node_id}/install?version={version}"

    response = requests.get(url)
    response = requests.get(url, verify=not manager_util.bypass_ssl)
    if response.status_code == 200:
        # Convert the API response to a NodeVersion object
        return map_node_version(response.json())

@@ -190,7 +191,7 @@ def all_versions_of_node(node_id):
    url = f"{base_url}/nodes/{node_id}/versions?statuses=NodeVersionStatusActive&statuses=NodeVersionStatusPending"

    response = requests.get(url)
    response = requests.get(url, verify=not manager_util.bypass_ssl)
    if response.status_code == 200:
        return response.json()
    else:

@@ -210,6 +211,7 @@ def read_cnr_info(fullpath):
    project = data.get('project', {})
    name = project.get('name').strip().lower()
    original_name = project.get('name')

    # normalize version
    # for example: 2.5 -> 2.5.0

@@ -221,6 +223,7 @@ def read_cnr_info(fullpath):
    if name and version:  # repository is optional
        return {
            "id": name,
            "original_name": original_name,
            "version": version,
            "url": repository
        }

@@ -236,7 +239,7 @@ def generate_cnr_id(fullpath, cnr_id):
        if not os.path.exists(cnr_id_path):
            with open(cnr_id_path, "w") as f:
                return f.write(cnr_id)
    except:
    except Exception:
        print(f"[ComfyUI Manager] unable to create file: {cnr_id_path}")

@@ -246,7 +249,7 @@ def read_cnr_id(fullpath):
        if os.path.exists(cnr_id_path):
            with open(cnr_id_path) as f:
                return f.read().strip()
    except:
    except Exception:
        pass

    return None
```

108
comfyui_manager/common/context.py
Normal file
108
comfyui_manager/common/context.py
Normal file
@@ -0,0 +1,108 @@
import sys
import os
import logging
from . import manager_util
import toml
import git


# read env vars
comfy_path: str = os.environ.get('COMFYUI_PATH')
comfy_base_path = os.environ.get('COMFYUI_FOLDERS_BASE_PATH')

if comfy_path is None:
    try:
        comfy_path = os.path.abspath(os.path.dirname(sys.modules['__main__'].__file__))
        os.environ['COMFYUI_PATH'] = comfy_path
    except Exception:
        logging.error("[ComfyUI-Manager] environment variable 'COMFYUI_PATH' is not specified.")
        exit(-1)

if comfy_base_path is None:
    comfy_base_path = comfy_path

channel_list_template_path = os.path.join(manager_util.comfyui_manager_path, 'channels.list.template')
git_script_path = os.path.join(manager_util.comfyui_manager_path, "git_helper.py")

manager_files_path = None
manager_config_path = None
manager_channel_list_path = None
manager_startup_script_path: str = None
manager_snapshot_path = None
manager_pip_overrides_path = None
manager_pip_blacklist_path = None
manager_components_path = None
manager_batch_history_path = None

def update_user_directory(user_dir):
    global manager_files_path
    global manager_config_path
    global manager_channel_list_path
    global manager_startup_script_path
    global manager_snapshot_path
    global manager_pip_overrides_path
    global manager_pip_blacklist_path
    global manager_components_path
    global manager_batch_history_path

    manager_files_path = os.path.abspath(os.path.join(user_dir, 'default', 'ComfyUI-Manager'))
    if not os.path.exists(manager_files_path):
        os.makedirs(manager_files_path)

    manager_snapshot_path = os.path.join(manager_files_path, "snapshots")
    if not os.path.exists(manager_snapshot_path):
        os.makedirs(manager_snapshot_path)

    manager_startup_script_path = os.path.join(manager_files_path, "startup-scripts")
    if not os.path.exists(manager_startup_script_path):
        os.makedirs(manager_startup_script_path)

    manager_config_path = os.path.join(manager_files_path, 'config.ini')
    manager_channel_list_path = os.path.join(manager_files_path, 'channels.list')
    manager_pip_overrides_path = os.path.join(manager_files_path, "pip_overrides.json")
    manager_pip_blacklist_path = os.path.join(manager_files_path, "pip_blacklist.list")
    manager_components_path = os.path.join(manager_files_path, "components")
    manager_util.cache_dir = os.path.join(manager_files_path, "cache")
    manager_batch_history_path = os.path.join(manager_files_path, "batch_history")

    if not os.path.exists(manager_util.cache_dir):
        os.makedirs(manager_util.cache_dir)

    if not os.path.exists(manager_batch_history_path):
        os.makedirs(manager_batch_history_path)

try:
    import folder_paths
    update_user_directory(folder_paths.get_user_directory())

except Exception:
    # fallback:
    # This case is only possible when running with cm-cli, and in practice, this case is not actually used.
    update_user_directory(os.path.abspath(manager_util.comfyui_manager_path))


def get_current_comfyui_ver():
    """
    Extract version from pyproject.toml
    """
    toml_path = os.path.join(comfy_path, 'pyproject.toml')
    if not os.path.exists(toml_path):
        return None
    else:
        try:
            with open(toml_path, "r", encoding="utf-8") as f:
                data = toml.load(f)

            project = data.get('project', {})
            return project.get('version')
        except Exception:
            return None


def get_comfyui_tag():
    try:
        with git.Repo(comfy_path) as repo:
            return repo.git.describe('--tags')
    except Exception:
        return None

18 comfyui_manager/common/enums.py (Normal file)
@@ -0,0 +1,18 @@
import enum

class NetworkMode(enum.Enum):
    PUBLIC = "public"
    PRIVATE = "private"
    OFFLINE = "offline"
    PERSONAL_CLOUD = "personal_cloud"

class SecurityLevel(enum.Enum):
    STRONG = "strong"
    NORMAL = "normal"
    NORMAL_MINUS = "normal-minus"
    WEAK = "weak"

class DBMode(enum.Enum):
    LOCAL = "local"
    CACHE = "cache"
    REMOTE = "remote"
@@ -15,9 +15,12 @@ comfy_path = os.environ.get('COMFYUI_PATH')
git_exe_path = os.environ.get('GIT_EXE_PATH')

if comfy_path is None:
    print("\nWARN: The `COMFYUI_PATH` environment variable is not set. Assuming `custom_nodes/ComfyUI-Manager/../../` as the ComfyUI path.", file=sys.stderr)
    comfy_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..'))
    print("git_helper: environment variable 'COMFYUI_PATH' is not specified.")
    exit(-1)

if not os.path.exists(os.path.join(comfy_path, 'folder_paths.py')):
    print(f"git_helper: '{comfy_path}' is not a valid 'COMFYUI_PATH' location.")
    exit(-1)

def download_url(url, dest_folder, filename=None):
    # Ensure the destination folder exists
@@ -153,27 +156,27 @@ def switch_to_default_branch(repo):
        default_branch = repo.git.symbolic_ref(f'refs/remotes/{remote_name}/HEAD').replace(f'refs/remotes/{remote_name}/', '')
        repo.git.checkout(default_branch)
        return True
    except:
    except Exception:
        # try checkout master
        # try checkout main if failed
        try:
            repo.git.checkout(repo.heads.master)
            return True
        except:
        except Exception:
            try:
                if remote_name is not None:
                    repo.git.checkout('-b', 'master', f'{remote_name}/master')
                    return True
            except:
            except Exception:
                try:
                    repo.git.checkout(repo.heads.main)
                    return True
                except:
                except Exception:
                    try:
                        if remote_name is not None:
                            repo.git.checkout('-b', 'main', f'{remote_name}/main')
                            return True
                    except:
                    except Exception:
                        pass

    print("[ComfyUI Manager] Failed to switch to the default branch")
@@ -444,7 +447,7 @@ def restore_pip_snapshot(pips, options):
        res = 1
        try:
            res = subprocess.check_call([sys.executable, '-m', 'pip', 'install'] + non_url)
        except:
        except Exception:
            pass

        # fallback
@@ -453,7 +456,7 @@ def restore_pip_snapshot(pips, options):
            res = 1
            try:
                res = subprocess.check_call([sys.executable, '-m', 'pip', 'install', x])
            except:
            except Exception:
                pass

            if res != 0:
@@ -464,7 +467,7 @@ def restore_pip_snapshot(pips, options):
            res = 1
            try:
                res = subprocess.check_call([sys.executable, '-m', 'pip', 'install', x])
            except:
            except Exception:
                pass

            if res != 0:
@@ -475,7 +478,7 @@ def restore_pip_snapshot(pips, options):
            res = 1
            try:
                res = subprocess.check_call([sys.executable, '-m', 'pip', 'install', x])
            except:
            except Exception:
                pass

            if res != 0:
@@ -46,6 +46,8 @@ def git_url(fullpath):

    for k, v in config.items():
        if k.startswith('remote ') and 'url' in v:
            if 'Comfy-Org/ComfyUI-Manager' in v['url']:
                return "https://github.com/ltdrdata/ComfyUI-Manager"
            return v['url']

    return None
@@ -55,7 +55,11 @@ def download_url(model_url: str, model_dir: str, filename: str):
        return aria2_download_url(model_url, model_dir, filename)
    else:
        from torchvision.datasets.utils import download_url as torchvision_download_url
        return torchvision_download_url(model_url, model_dir, filename)
        try:
            return torchvision_download_url(model_url, model_dir, filename)
        except Exception as e:
            logging.error(f"[ComfyUI-Manager] Failed to download: {model_url} / {repr(e)}")
            raise


def aria2_find_task(dir: str, filename: str):

36 comfyui_manager/common/manager_security.py (Normal file)
@@ -0,0 +1,36 @@
from enum import Enum

is_personal_cloud_mode = False
handler_policy = {}

class HANDLER_POLICY(Enum):
    MULTIPLE_REMOTE_BAN_NON_LOCAL = 1
    MULTIPLE_REMOTE_BAN_NOT_PERSONAL_CLOUD = 2
    BANNED = 3


def is_loopback(address):
    import ipaddress
    try:
        return ipaddress.ip_address(address).is_loopback
    except ValueError:
        return False


def do_nothing():
    pass


def get_handler_policy(x):
    return handler_policy.get(x) or set()

def add_handler_policy(x, policy):
    s = handler_policy.get(x)
    if s is None:
        s = set()
        handler_policy[x] = s

    s.add(policy)


multiple_remote_alert = do_nothing
@@ -18,12 +18,16 @@ import shlex


cache_lock = threading.Lock()
session_lock = threading.Lock()

comfyui_manager_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..'))
cache_dir = os.path.join(comfyui_manager_path, '.cache')  # This path is also updated together in **manager_core.update_user_directory**.

use_uv = False
bypass_ssl = False

def is_manager_pip_package():
    return not os.path.exists(os.path.join(comfyui_manager_path, '..', 'custom_nodes'))

def add_python_path_to_env():
    if platform.system() != "Windows":
@@ -50,7 +54,7 @@ def make_pip_cmd(cmd):
# DON'T USE StrictVersion - cannot handle pre_release version
# try:
#     from distutils.version import StrictVersion
# except:
# except Exception:
#     print(f"[ComfyUI-Manager] 'distutils' package not found. Activating fallback mode for compatibility.")
class StrictVersion:
    def __init__(self, version_string):
@@ -136,7 +140,7 @@ async def get_data(uri, silent=False):
        print(f"FETCH DATA from: {uri}", end="")

    if uri.startswith("http"):
        async with aiohttp.ClientSession(trust_env=True, connector=aiohttp.TCPConnector(verify_ssl=False)) as session:
        async with aiohttp.ClientSession(trust_env=True, connector=aiohttp.TCPConnector(verify_ssl=not bypass_ssl)) as session:
            headers = {
                'Cache-Control': 'no-cache',
                'Pragma': 'no-cache',
@@ -256,7 +260,7 @@ def get_installed_packages(renew=False):
                pip_map[normalized_name] = y[1]
    except subprocess.CalledProcessError:
        logging.error("[ComfyUI-Manager] Failed to retrieve the information of installed pip packages.")
        return set()
        return {}

    return pip_map

@@ -307,6 +311,7 @@ def parse_requirement_line(line):


torch_torchvision_torchaudio_version_map = {
    '2.7.0': ('0.22.0', '2.7.0'),
    '2.6.0': ('0.21.0', '2.6.0'),
    '2.5.1': ('0.20.0', '2.5.0'),
    '2.5.0': ('0.20.0', '2.5.0'),
@@ -325,6 +330,32 @@ torch_torchvision_torchaudio_version_map = {
}

def torch_rollback(prev):
    spec = prev.split('+')
    if len(spec) > 1:
        platform = spec[1]
    else:
        cmd = make_pip_cmd(['install', '--force', 'torch', 'torchvision', 'torchaudio'])
        subprocess.check_output(cmd, universal_newlines=True)
        logging.error(cmd)
        return

    torch_ver = StrictVersion(spec[0])
    torch_ver = f"{torch_ver.major}.{torch_ver.minor}.{torch_ver.patch}"
    torch_torchvision_torchaudio_ver = torch_torchvision_torchaudio_version_map.get(torch_ver)

    if torch_torchvision_torchaudio_ver is None:
        cmd = make_pip_cmd(['install', '--pre', 'torch', 'torchvision', 'torchaudio',
                            '--index-url', f"https://download.pytorch.org/whl/nightly/{platform}"])
        logging.info("[ComfyUI-Manager] restore PyTorch to nightly version")
    else:
        torchvision_ver, torchaudio_ver = torch_torchvision_torchaudio_ver
        cmd = make_pip_cmd(['install', f'torch=={torch_ver}', f'torchvision=={torchvision_ver}', f"torchaudio=={torchaudio_ver}",
                            '--index-url', f"https://download.pytorch.org/whl/{platform}"])
        logging.info(f"[ComfyUI-Manager] restore PyTorch to {torch_ver}+{platform}")

    subprocess.check_output(cmd, universal_newlines=True)


class PIPFixer:
    def __init__(self, prev_pip_versions, comfyui_path, manager_files_path):
@@ -332,32 +363,6 @@ class PIPFixer:
        self.comfyui_path = comfyui_path
        self.manager_files_path = manager_files_path

    def torch_rollback(self):
        spec = self.prev_pip_versions['torch'].split('+')
        if len(spec) > 0:
            platform = spec[1]
        else:
            cmd = make_pip_cmd(['install', '--force', 'torch', 'torchvision', 'torchaudio'])
            subprocess.check_output(cmd, universal_newlines=True)
            logging.error(cmd)
            return

        torch_ver = StrictVersion(spec[0])
        torch_ver = f"{torch_ver.major}.{torch_ver.minor}.{torch_ver.patch}"
        torch_torchvision_torchaudio_ver = torch_torchvision_torchaudio_version_map.get(torch_ver)

        if torch_torchvision_torchaudio_ver is None:
            cmd = make_pip_cmd(['install', '--pre', 'torch', 'torchvision', 'torchaudio',
                                '--index-url', f"https://download.pytorch.org/whl/nightly/{platform}"])
            logging.info("[ComfyUI-Manager] restore PyTorch to nightly version")
        else:
            torchvision_ver, torchaudio_ver = torch_torchvision_torchaudio_ver
            cmd = make_pip_cmd(['install', f'torch=={torch_ver}', f'torchvision=={torchvision_ver}', f"torchaudio=={torchaudio_ver}",
                                '--index-url', f"https://download.pytorch.org/whl/{platform}"])
            logging.info(f"[ComfyUI-Manager] restore PyTorch to {torch_ver}+{platform}")

        subprocess.check_output(cmd, universal_newlines=True)

    def fix_broken(self):
        new_pip_versions = get_installed_packages(True)

@@ -379,7 +384,7 @@ class PIPFixer:
            elif self.prev_pip_versions['torch'] != new_pip_versions['torch'] \
                    or self.prev_pip_versions['torchvision'] != new_pip_versions['torchvision'] \
                    or self.prev_pip_versions['torchaudio'] != new_pip_versions['torchaudio']:
                self.torch_rollback()
                torch_rollback(self.prev_pip_versions['torch'])
        except Exception as e:
            logging.error("[ComfyUI-Manager] Failed to restore PyTorch")
            logging.error(e)
@@ -410,7 +415,7 @@ class PIPFixer:

            if len(targets) > 0:
                for x in targets:
                    cmd = make_pip_cmd(['install', f"{x}=={versions[0].version_string}", "numpy<2"])
                    cmd = make_pip_cmd(['install', f"{x}=={versions[0].version_string}"])
                    subprocess.check_output(cmd, universal_newlines=True)

                logging.info(f"[ComfyUI-Manager] 'opencv' dependencies were fixed: {targets}")
@@ -418,19 +423,6 @@ class PIPFixer:
            logging.error("[ComfyUI-Manager] Failed to restore opencv")
            logging.error(e)

        # fix numpy
        try:
            np = new_pip_versions.get('numpy')
            if np is not None:
                if StrictVersion(np) >= StrictVersion('2'):
                    cmd = make_pip_cmd(['install', "numpy<2"])
                    subprocess.check_output(cmd, universal_newlines=True)

                logging.info("[ComfyUI-Manager] 'numpy' dependency were fixed")
        except Exception as e:
            logging.error("[ComfyUI-Manager] Failed to restore numpy")
            logging.error(e)

        # fix missing frontend
        try:
            # NOTE: package name in requirements is 'comfyui-frontend-package'
@@ -469,7 +461,7 @@ class PIPFixer:
                normalized_name = parsed['package'].lower().replace('-', '_')
                if normalized_name in new_pip_versions:
                    if 'version' in parsed and 'operator' in parsed:
                        cur = StrictVersion(new_pip_versions[parsed['package']])
                        cur = StrictVersion(new_pip_versions[normalized_name])
                        dest = parsed['version']
                        op = parsed['operator']
                        if cur == dest:
@@ -517,7 +509,7 @@ def robust_readlines(fullpath):
    try:
        with open(fullpath, "r") as f:
            return f.readlines()
    except:
    except Exception:
        encoding = None
        with open(fullpath, "rb") as f:
            raw_data = f.read()
@@ -530,3 +522,69 @@ def robust_readlines(fullpath):

    print(f"[ComfyUI-Manager] Failed to recognize encoding for: {fullpath}")
    return []

def restore_pip_snapshot(pips, options):
    non_url = []
    local_url = []
    non_local_url = []

    for k, v in pips.items():
        # NOTE: skip torch related packages
        if k.startswith("torch==") or k.startswith("torchvision==") or k.startswith("torchaudio==") or k.startswith("nvidia-"):
            continue

        if v == "":
            non_url.append(k)
        else:
            if v.startswith('file:'):
                local_url.append(v)
            else:
                non_local_url.append(v)


    # restore other pips
    failed = []
    if '--pip-non-url' in options:
        # try all at once
        res = 1
        try:
            res = subprocess.check_output(make_pip_cmd(['install'] + non_url))
        except Exception:
            pass

        # fallback
        if res != 0:
            for x in non_url:
                res = 1
                try:
                    res = subprocess.check_output(make_pip_cmd(['install', '--no-deps', x]))
                except Exception:
                    pass

                if res != 0:
                    failed.append(x)

    if '--pip-non-local-url' in options:
        for x in non_local_url:
            res = 1
            try:
                res = subprocess.check_output(make_pip_cmd(['install', '--no-deps', x]))
            except Exception:
                pass

            if res != 0:
                failed.append(x)

    if '--pip-local-url' in options:
        for x in local_url:
            res = 1
            try:
                res = subprocess.check_output(make_pip_cmd(['install', '--no-deps', x]))
            except Exception:
                pass

            if res != 0:
                failed.append(x)

    print(f"Installation failed for pip packages: {failed}")
@@ -3,7 +3,7 @@ from __future__ import annotations
from dataclasses import dataclass
import os

from git_utils import get_commit_hash
from .git_utils import get_commit_hash


@dataclass

713 comfyui_manager/common/pip_util.design.en.md (Normal file)
@@ -0,0 +1,713 @@
# Design Document for pip_util.py Implementation

This module is designed to minimize breakage of already-installed dependencies.

## List of Functions to Implement

## Global Policy Management

### Global Variables
```python
_pip_policy_cache = None  # Policy cache (program-wide, loaded once)
```

### Global Functions

* get_pip_policy(): Returns the policy for resolving pip dependency conflicts (lazy loading; a minimal sketch follows this list)
  - **Call timing**: Called whenever needed (the policy is loaded automatically, and only once, on the first call)
  - **Purpose**: Returns the policy cache, loading it automatically if the cache is empty
  - **Execution flow**:
    1. Declare global _pip_policy_cache
    2. If _pip_policy_cache is already loaded, return it immediately (prevents duplicate loading)
    3. Read the base policy file:
       - Path: {manager_util.comfyui_manager_path}/pip-policy.json
       - Use an empty dictionary if the file doesn't exist
       - Log an error and use an empty dictionary if JSON parsing fails
    4. Read the user policy file:
       - Path: {context.manager_files_path}/pip-policy.user.json
       - Create an empty JSON file if it doesn't exist ({"_comment": "User-specific pip policy overrides"})
       - Log a warning and use an empty dictionary if JSON parsing fails
    5. Apply merge rules (merge by package name):
       - Start from the base policy
       - For each package in the user policy:
         * Package only in the user policy: add it to the base
         * Package only in the base policy: keep it in the base
         * Package in both: completely replace with the user policy (whole-package replacement, not section-level)
    6. Store the merged policy in _pip_policy_cache
    7. Log policy load success (including the number of loaded package policies)
    8. Return _pip_policy_cache
  - **Return value**: Dict (merged policy dictionary)
  - **Exception handling**:
    - File read failure: Log a warning and treat the file as an empty dictionary
    - JSON parsing failure: Log an error and treat the file as an empty dictionary
  - **Notes**:
    - The lazy-loading pattern loads the policy automatically on the first call
    - Not thread-safe; caution is needed in multi-threaded environments
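To make the lazy-loading and merge rules above concrete, here is a minimal sketch under the document's assumptions; the helper `_load_policy_file`, the logger name, and the filtering of underscore-prefixed metadata keys are illustrative details, not part of the specified API:

```python
import json
import logging
import os

from . import manager_util, context

logger = logging.getLogger(__name__)

_pip_policy_cache = None  # Policy cache (program-wide, loaded once)


def _load_policy_file(path, create_if_missing=False):
    """Read one policy file; any failure degrades to an empty dict."""
    try:
        if not os.path.exists(path):
            if create_if_missing:
                with open(path, "w", encoding="utf-8") as f:
                    json.dump({"_comment": "User-specific pip policy overrides"}, f)
            return {}
        with open(path, "r", encoding="utf-8") as f:
            return json.load(f)
    except json.JSONDecodeError:
        logger.error(f"[pip_util] invalid JSON in policy file: {path}")
        return {}
    except Exception:
        logger.warning(f"[pip_util] cannot read policy file: {path}")
        return {}


def get_pip_policy():
    global _pip_policy_cache
    if _pip_policy_cache is not None:
        return _pip_policy_cache  # already loaded once

    base = _load_policy_file(os.path.join(manager_util.comfyui_manager_path, 'pip-policy.json'))
    user = _load_policy_file(os.path.join(context.manager_files_path, 'pip-policy.user.json'),
                             create_if_missing=True)

    merged = dict(base)
    for name, policy in user.items():
        if name.startswith('_'):
            continue  # skip metadata keys such as "_comment" (assumption)
        merged[name] = policy  # whole-package replacement, not section-level merge

    _pip_policy_cache = merged
    logger.info(f"[pip_util] pip policy loaded: {len(merged)} package policies")
    return _pip_policy_cache
```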
- The policy file structure should support the following scenarios:
  - Dictionary structure of {dependency name -> policy object}
  - A policy object has four policy sections:
    - **uninstall**: Package removal policy (pre-processing, condition optional)
    - **apply_first_match**: Evaluate top-to-bottom and execute only the first policy that satisfies its condition (exclusive)
    - **apply_all_matches**: Execute all policies that satisfy their conditions (cumulative)
    - **restore**: Package restoration policy (post-processing, condition optional)

- Condition types:
  - installed: Check the version condition of already-installed dependencies
    - spec is optional
    - package field: Specify the package to check (optional, defaults to self)
      - Explicit: Reference another package (e.g., numba checks the numpy version)
      - Omitted: Check own version (e.g., critical-package checks its own version)
  - platform: Platform conditions (os, has_gpu, comfyui_version, etc.)
  - If the condition is absent, it is always considered satisfied

- uninstall policy (pre-removal policy):
  - Removal policy list (condition is optional; evaluate top-to-bottom and execute only the first match)
  - When the condition is satisfied (or always, if there is no condition): remove the target package and abort installation
  - If this policy is applied, all subsequent steps are ignored
  - The target field specifies the package to remove
  - Example: Unconditionally remove if a specific package is installed

- Actions available in apply_first_match (determine the installation method, exclusive):
  - skip: Block installation of a specific dependency
  - force_version: Force a change to a specific version during installation
    - The extra_index_url field can specify a custom package repository (optional)
  - replace: Replace with a different dependency
    - The extra_index_url field can specify a custom package repository (optional)

- Actions available in apply_all_matches (installation options, cumulative):
  - pin_dependencies: Pin the currently installed versions of other dependencies
    - The pinned_packages field specifies the package list
    - Example: `pip install requests urllib3==1.26.15 certifi==2023.7.22 charset-normalizer==3.2.0`
    - Real use case: Prevent urllib3 from upgrading to 2.x when installing requests
    - on_failure: "fail" or "retry_without_pin"
  - install_with: Specify additional dependencies to install together
  - warn: Record a warning message in the log

- restore policy (post-restoration policy):
  - Restoration policy list (condition is optional; evaluate top-to-bottom and execute only the first match)
  - Executed after package installation completes (post-processing)
  - When the condition is satisfied (or always, if there is no condition): force-install the target package to a specific version
  - The target field specifies the package to restore (can be a different package)
  - The version field specifies the version to install
  - The extra_index_url field can specify a custom package repository (optional)
  - Example: Reinstall/change the version if a specific package is deleted or at the wrong version

- Execution order:
  1. uninstall evaluation: If the condition is satisfied, remove the package and **terminate** (ignore subsequent steps)
  2. apply_first_match evaluation:
     - Execute the first policy that satisfies its condition among skip/force_version/replace
     - If no policy matches, proceed with the default installation of the originally requested package
  3. apply_all_matches evaluation: Apply all pin_dependencies, install_with, and warn policies that satisfy their conditions
  4. Execute the actual package installation (pip install or uv pip install)
  5. restore evaluation: If the condition is satisfied, restore the target package (post-processing)

## Batch Unit Class (PipBatch)

### Class Structure
```python
class PipBatch:
    """
    pip package installation batch unit manager
    Maintains a pip freeze cache during batch operations for performance optimization

    Usage pattern:
        # Batch operations (policy auto-loaded)
        with PipBatch() as batch:
            batch.ensure_not_installed()
            batch.install("numpy>=1.20")
            batch.install("pandas>=2.0")
            batch.install("scipy>=1.7")
            batch.ensure_installed()
    """

    def __init__(self):
        self._installed_cache = None  # Installed packages cache (batch-level)

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self._installed_cache = None
```

### Private Methods

* PipBatch._refresh_installed_cache():
  - **Purpose**: Read the currently installed package information and refresh the cache
  - **Execution flow**:
    1. Generate the command using manager_util.make_pip_cmd(["freeze"])
    2. Execute pip freeze via subprocess
    3. Parse the output:
       - Each line is in "package_name==version" format
       - Parse "package_name==version" to create a dictionary
       - Ignore editable packages (starting with -e)
       - Ignore comments (starting with #)
    4. Store the parsed dictionary in self._installed_cache
  - **Return value**: None
  - **Exception handling**:
    - pip freeze failure: Set the cache to an empty dictionary and log a warning
    - Parse failure: Ignore the line and continue

* PipBatch._get_installed_packages():
  - **Purpose**: Return the cached installed package information (refresh if the cache is None)
  - **Execution flow**:
    1. If self._installed_cache is None, call _refresh_installed_cache()
    2. Return self._installed_cache
  - **Return value**: {package_name: version} dictionary

* PipBatch._invalidate_cache():
  - **Purpose**: Invalidate the cache after a package install/uninstall
  - **Execution flow**:
    1. Set self._installed_cache = None
  - **Return value**: None
  - **Call timing**: After install(), ensure_not_installed(), ensure_installed()

* PipBatch._parse_package_spec(package_info): (sketched together with _evaluate_condition after this list)
  - **Purpose**: Split a package spec string into package name and version spec
  - **Parameters**:
    - package_info: "numpy", "numpy==1.26.0", "numpy>=1.20.0", "numpy~=1.20", etc.
  - **Execution flow**:
    1. Use a regex to split the package name and version spec
    2. Pattern: `^([a-zA-Z0-9_-]+)([><=!~]+.*)?$`
  - **Return value**: (package_name, version_spec) tuple
    - Examples: ("numpy", "==1.26.0"), ("pandas", ">=2.0.0"), ("scipy", None)
  - **Exception handling**:
    - Parse failure: Raise ValueError

* PipBatch._evaluate_condition(condition, package_name, installed_packages):
  - **Purpose**: Evaluate a policy condition and return whether it is satisfied
  - **Parameters**:
    - condition: Policy condition object (dictionary)
    - package_name: Name of the package currently being processed
    - installed_packages: {package_name: version} dictionary
  - **Execution flow**:
    1. If condition is None, return True (always satisfied)
    2. Branch based on condition["type"]:
       a. "installed" type:
          - target_package = condition.get("package", package_name)
          - Check the current version with installed_packages.get(target_package)
          - If not installed (None), return False
          - If spec exists, compare versions using packaging.specifiers.SpecifierSet
          - If there is no spec, only check installation status (True)
       b. "platform" type:
          - If condition["os"] exists, compare with platform.system()
          - If condition["has_gpu"] exists, check GPU presence (torch.cuda.is_available(), etc.)
          - If condition["comfyui_version"] exists, compare the ComfyUI version
          - Return True if all conditions are satisfied
    3. Return True if all conditions are satisfied, False if any is unsatisfied
  - **Return value**: bool
  - **Exception handling**:
    - Version comparison failure: Log a warning and return False
    - Unknown condition type: Log a warning and return False

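The two helpers below sketch the parsing and condition-evaluation logic described above. They are shown as standalone functions for brevity (the design defines them as PipBatch methods), and probing GPU presence via torch is one plausible reading of "torch.cuda.is_available(), etc.":

```python
import logging
import platform
import re

from packaging.specifiers import SpecifierSet

logger = logging.getLogger(__name__)


def _parse_package_spec(package_info):
    """Split 'numpy>=1.20' into ('numpy', '>=1.20'); the version spec may be None."""
    m = re.match(r'^([a-zA-Z0-9_-]+)([><=!~]+.*)?$', package_info.strip())
    if m is None:
        raise ValueError(f"invalid package spec: {package_info!r}")
    return m.group(1), m.group(2)


def _evaluate_condition(condition, package_name, installed_packages):
    """Return True when the policy condition holds; no condition means always True."""
    if condition is None:
        return True
    try:
        if condition["type"] == "installed":
            target = condition.get("package", package_name)  # defaults to self
            current = installed_packages.get(target)
            if current is None:
                return False
            spec = condition.get("spec")
            return True if spec is None else current in SpecifierSet(spec)
        if condition["type"] == "platform":
            if "os" in condition and condition["os"].lower() != platform.system().lower():
                return False
            if "has_gpu" in condition:
                try:
                    import torch
                    has_gpu = torch.cuda.is_available()
                except Exception:
                    has_gpu = False  # treat a missing torch as "no GPU"
                if condition["has_gpu"] != has_gpu:
                    return False
            # comfyui_version comparison is left as a TODO in the plan document
            return True
        logger.warning(f"[pip_util] unknown condition type: {condition['type']}")
        return False
    except Exception as e:
        logger.warning(f"[pip_util] condition evaluation failed: {e}")
        return False
```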
### Public Methods

* PipBatch.install(package_info, extra_index_url=None, override_policy=False):
  - **Purpose**: Perform policy-based pip package installation (per individual package)
  - **Parameters**:
    - package_info: Package name and version spec (e.g., "numpy", "numpy==1.26.0", "numpy>=1.20.0")
    - extra_index_url: Additional package repository URL (optional)
    - override_policy: If True, skip policy application and install directly (default: False)
  - **Execution flow**: (a condensed skeleton is sketched after this list)
    1. Call get_pip_policy() to get the policy (lazy loading)
    2. Use self._parse_package_spec() to split package_info into package name and version spec
    3. Call self._get_installed_packages() to get the cached installed package information
    4. If override_policy=True → jump directly to step 10 (skip policy)
    5. Get the policy for the package name from the policy dictionary
    6. If there is no policy → jump to step 10 (default installation)
    7. **apply_first_match policy evaluation** (exclusive - only the first match):
       - Iterate through the policy list top-to-bottom
       - Evaluate each policy's condition with self._evaluate_condition()
       - When the first condition-satisfying policy is found:
         * type="skip": Log the reason and return False (don't install)
         * type="force_version": Change the package_info version to the policy's version
         * type="replace": Completely replace package_info with the policy's replacement package
       - If no policy matches, keep the original package_info
    8. **apply_all_matches policy evaluation** (cumulative - all matches):
       - Iterate through the policy list top-to-bottom
       - Evaluate each policy's condition with self._evaluate_condition()
       - For all condition-satisfying policies:
         * type="pin_dependencies":
           - For each package in pinned_packages, query the current version with self._installed_cache.get(pkg)
           - Pin to the installed version in "package==version" format
           - Add to the installation package list
         * type="install_with":
           - Add additional_packages to the installation package list
         * type="warn":
           - Output the message as a warning log
           - If allow_continue=false, wait for user confirmation (optional)
    9. Compose the final installation package list:
       - The main package (modified/replaced package_info)
       - Packages pinned by pin_dependencies
       - Packages added by install_with
    10. Handle extra_index_url:
        - A parameter-passed extra_index_url takes priority
        - Otherwise use the extra_index_url defined in the policy
    11. Generate the pip/uv command using manager_util.make_pip_cmd():
        - Basic format: ["pip", "install"] + package list
        - If extra_index_url exists: add ["--extra-index-url", url]
    12. Execute the command via subprocess
    13. Handle installation failure:
        - If pin_dependencies' on_failure="retry_without_pin":
          * Retry with only the main package, excluding pinned packages
        - If on_failure="fail":
          * Raise an exception and abort the installation
        - Otherwise: Log a warning and continue
    14. On successful installation:
        - Call self._invalidate_cache() (invalidate the cache)
        - Log info if a reason exists
        - Return True
  - **Return value**: Installation success status (bool)
  - **Exception handling**:
    - Policy parsing failure: Log a warning and proceed with default installation
    - Installation failure: Log an error and raise an exception (depends on the on_failure setting)
  - **Notes**:
    - The restore policy is not handled in this method (batch-processed in ensure_installed())
    - The uninstall policy is not handled in this method (batch-processed in ensure_not_installed())

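A condensed skeleton of the fourteen steps above, assuming the helpers already described (get_pip_policy, _parse_package_spec, _evaluate_condition, manager_util.make_pip_cmd); the retry and user-confirmation branches are trimmed here and revisited in the implementation plan:

```python
# inside class PipBatch — assumes the imports from Task 1.1 (subprocess, logger, manager_util)
def install(self, package_info, extra_index_url=None, override_policy=False):
    policy = get_pip_policy()
    name, _spec = self._parse_package_spec(package_info)
    installed = self._get_installed_packages()

    pinned, extras = [], []
    pkg_policy = None if override_policy else policy.get(name)

    if pkg_policy:
        # apply_first_match: only the first condition-satisfying rule wins
        for rule in pkg_policy.get("apply_first_match", []):
            if not self._evaluate_condition(rule.get("condition"), name, installed):
                continue
            if rule["type"] == "skip":
                logger.info(f"[pip_util] skip {name}: {rule.get('reason')}")
                return False
            if rule["type"] == "force_version":
                package_info = f"{name}=={rule['version']}"
            elif rule["type"] == "replace":
                package_info = rule["replacement"]
            extra_index_url = extra_index_url or rule.get("extra_index_url")
            break

        # apply_all_matches: every condition-satisfying rule contributes
        for rule in pkg_policy.get("apply_all_matches", []):
            if not self._evaluate_condition(rule.get("condition"), name, installed):
                continue
            if rule["type"] == "pin_dependencies":
                pinned += [f"{p}=={installed[p]}" for p in rule["pinned_packages"] if p in installed]
            elif rule["type"] == "install_with":
                extras += rule.get("additional_packages", [])
            elif rule["type"] == "warn":
                logger.warning(rule.get("message"))

    cmd = manager_util.make_pip_cmd(["install", package_info] + pinned + extras)
    if extra_index_url:
        cmd += ["--extra-index-url", extra_index_url]
    subprocess.check_call(cmd)

    self._invalidate_cache()
    return True
```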
* PipBatch.ensure_not_installed():
  - **Purpose**: Iterate through all policies and remove every package satisfying its uninstall conditions (batch processing)
  - **Parameters**: None
  - **Execution flow**:
    1. Call get_pip_policy() to get the policy (lazy loading)
    2. Call self._get_installed_packages() to get the cached installed package information
    3. Iterate through all package policies in the policy dictionary:
       a. Check whether each package has an uninstall policy
       b. If an uninstall policy exists:
          - Iterate through the uninstall policy list top-to-bottom
          - Evaluate each policy's condition with self._evaluate_condition()
          - When the first condition-satisfying policy is found:
            * Check whether the target package exists in self._installed_cache
            * If installed:
              - Generate the command with manager_util.make_pip_cmd(["uninstall", "-y", target])
              - Execute pip uninstall via subprocess
              - Log the reason in the info log
              - Add to the removed-package list
              - Remove the package from self._installed_cache
            * Move to the next package (only the first match per package)
    4. Complete the iteration through all package policies
  - **Return value**: List of removed package names (list of str)
  - **Exception handling**:
    - Individual package removal failure: Log a warning only and continue to the next package
  - **Call timing**:
    - Called at batch operation start to pre-remove conflicting packages
    - Called before multiple package installations to clean the installation environment

* PipBatch.ensure_installed(): (both batch methods are sketched after this list)
  - **Purpose**: Iterate through all policies and restore every package satisfying its restore conditions (batch processing)
  - **Parameters**: None
  - **Execution flow**:
    1. Call get_pip_policy() to get the policy (lazy loading)
    2. Call self._get_installed_packages() to get the cached installed package information
    3. Iterate through all package policies in the policy dictionary:
       a. Check whether each package has a restore policy
       b. If a restore policy exists:
          - Iterate through the restore policy list top-to-bottom
          - Evaluate each policy's condition with self._evaluate_condition()
          - When the first condition-satisfying policy is found:
            * Get the target package name (the policy's "target" field)
            * Get the version specified in the version field
            * Check the current version with self._installed_cache.get(target)
            * If the current version is None or differs from the specified version:
              - Compose package_spec = f"{target}=={version}"
              - Generate the command with manager_util.make_pip_cmd(["install", package_spec])
              - If extra_index_url exists, add ["--extra-index-url", url]
              - Execute pip install via subprocess
              - Log the reason in the info log
              - Add to the restored-package list
              - Update the cache: self._installed_cache[target] = version
            * Move to the next package (only the first match per package)
    4. Complete the iteration through all package policies
  - **Return value**: List of restored package names (list of str)
  - **Exception handling**:
    - Individual package installation failure: Log a warning only and continue to the next package
  - **Call timing**:
    - Called at batch operation end to restore essential package versions
    - Called for environment verification after multiple package installations

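Both batch methods follow the same shape: iterate over every package policy, apply the first matching rule, and keep going on per-package failures. A minimal sketch, assuming the same helpers as above:

```python
# inside class PipBatch — assumes the imports from Task 1.1 (subprocess, logger, manager_util)
def ensure_not_installed(self):
    """Remove every package whose uninstall condition holds (first match per package)."""
    policy = get_pip_policy()
    installed = self._get_installed_packages()
    removed = []
    for name, pkg_policy in policy.items():
        for rule in pkg_policy.get("uninstall", []):
            if not self._evaluate_condition(rule.get("condition"), name, installed):
                continue
            target = rule["target"]
            if target in installed:
                try:
                    subprocess.check_call(manager_util.make_pip_cmd(["uninstall", "-y", target]))
                    logger.info(f"[pip_util] uninstalled {target}: {rule.get('reason')}")
                    removed.append(target)
                    installed.pop(target, None)
                except Exception as e:
                    logger.warning(f"[pip_util] failed to uninstall {target}: {e}")
            break  # only the first matching rule per package
    return removed


def ensure_installed(self):
    """Restore every package whose restore condition holds (first match per package)."""
    policy = get_pip_policy()
    installed = self._get_installed_packages()
    restored = []
    for name, pkg_policy in policy.items():
        for rule in pkg_policy.get("restore", []):
            if not self._evaluate_condition(rule.get("condition"), name, installed):
                continue
            target, version = rule["target"], rule["version"]
            if installed.get(target) != version:
                cmd = manager_util.make_pip_cmd(["install", f"{target}=={version}"])
                if rule.get("extra_index_url"):
                    cmd += ["--extra-index-url", rule["extra_index_url"]]
                try:
                    subprocess.check_call(cmd)
                    logger.info(f"[pip_util] restored {target}=={version}: {rule.get('reason')}")
                    restored.append(target)
                    installed[target] = version
                except Exception as e:
                    logger.warning(f"[pip_util] failed to restore {target}: {e}")
            break  # only the first matching rule per package
    return restored
```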
## pip-policy.json Examples

### Base Policy File ({manager_util.comfyui_manager_path}/pip-policy.json)
```json
{
  "torch": {
    "apply_first_match": [
      {
        "type": "skip",
        "reason": "PyTorch installation should be managed manually due to CUDA compatibility"
      }
    ]
  },

  "opencv-python": {
    "apply_first_match": [
      {
        "type": "replace",
        "replacement": "opencv-contrib-python",
        "version": ">=4.8.0",
        "reason": "opencv-contrib-python includes all opencv-python features plus extras"
      }
    ]
  },

  "PIL": {
    "apply_first_match": [
      {
        "type": "replace",
        "replacement": "Pillow",
        "reason": "PIL is deprecated, use Pillow instead"
      }
    ]
  },

  "click": {
    "apply_first_match": [
      {
        "condition": {
          "type": "installed",
          "package": "colorama",
          "spec": "<0.5.0"
        },
        "type": "force_version",
        "version": "8.1.3",
        "reason": "click 8.1.3 compatible with colorama <0.5"
      }
    ],
    "apply_all_matches": [
      {
        "type": "pin_dependencies",
        "pinned_packages": ["colorama"],
        "reason": "Prevent colorama upgrade that may break compatibility"
      }
    ]
  },

  "requests": {
    "apply_all_matches": [
      {
        "type": "pin_dependencies",
        "pinned_packages": ["urllib3", "certifi", "charset-normalizer"],
        "on_failure": "retry_without_pin",
        "reason": "Prevent urllib3 from upgrading to 2.x which has breaking changes"
      }
    ]
  },

  "six": {
    "restore": [
      {
        "target": "six",
        "version": "1.16.0",
        "reason": "six must be maintained at 1.16.0 for compatibility"
      }
    ]
  },

  "urllib3": {
    "restore": [
      {
        "condition": {
          "type": "installed",
          "spec": "!=1.26.15"
        },
        "target": "urllib3",
        "version": "1.26.15",
        "reason": "urllib3 must be 1.26.15 for compatibility with legacy code"
      }
    ]
  },

  "onnxruntime": {
    "apply_first_match": [
      {
        "condition": {
          "type": "platform",
          "os": "linux",
          "has_gpu": true
        },
        "type": "replace",
        "replacement": "onnxruntime-gpu",
        "reason": "Use GPU version on Linux with CUDA"
      }
    ]
  },

  "legacy-custom-node-package": {
    "apply_first_match": [
      {
        "condition": {
          "type": "platform",
          "comfyui_version": "<1.0.0"
        },
        "type": "force_version",
        "version": "0.9.0",
        "reason": "legacy-custom-node-package 0.9.0 is compatible with ComfyUI <1.0.0"
      },
      {
        "condition": {
          "type": "platform",
          "comfyui_version": ">=1.0.0"
        },
        "type": "force_version",
        "version": "1.5.0",
        "reason": "legacy-custom-node-package 1.5.0 is required for ComfyUI >=1.0.0"
      }
    ]
  },

  "tensorflow": {
    "apply_all_matches": [
      {
        "condition": {
          "type": "installed",
          "package": "torch"
        },
        "type": "warn",
        "message": "Installing TensorFlow alongside PyTorch may cause CUDA conflicts",
        "allow_continue": true
      }
    ]
  },

  "some-package": {
    "uninstall": [
      {
        "condition": {
          "type": "installed",
          "package": "conflicting-package",
          "spec": ">=2.0.0"
        },
        "target": "conflicting-package",
        "reason": "conflicting-package >=2.0.0 conflicts with some-package"
      }
    ]
  },

  "banned-malicious-package": {
    "uninstall": [
      {
        "target": "banned-malicious-package",
        "reason": "Security vulnerability CVE-2024-XXXXX, always remove if attempting to install"
      }
    ]
  },

  "critical-package": {
    "restore": [
      {
        "condition": {
          "type": "installed",
          "package": "critical-package",
          "spec": "!=1.2.3"
        },
        "target": "critical-package",
        "version": "1.2.3",
        "extra_index_url": "https://custom-repo.example.com/simple",
        "reason": "critical-package must be version 1.2.3, restore if different or missing"
      }
    ]
  },

  "stable-package": {
    "apply_first_match": [
      {
        "condition": {
          "type": "installed",
          "package": "critical-dependency",
          "spec": ">=2.0.0"
        },
        "type": "force_version",
        "version": "1.5.0",
        "extra_index_url": "https://custom-repo.example.com/simple",
        "reason": "stable-package 1.5.0 is required when critical-dependency >=2.0.0 is installed"
      }
    ]
  },

  "new-experimental-package": {
    "apply_all_matches": [
      {
        "type": "pin_dependencies",
        "pinned_packages": ["numpy", "pandas", "scipy"],
        "on_failure": "retry_without_pin",
        "reason": "new-experimental-package may upgrade numpy/pandas/scipy, pin them to prevent breakage"
      }
    ]
  },

  "pytorch-addon": {
    "apply_all_matches": [
      {
        "condition": {
          "type": "installed",
          "package": "torch",
          "spec": ">=2.0.0"
        },
        "type": "pin_dependencies",
        "pinned_packages": ["torch", "torchvision", "torchaudio"],
        "on_failure": "fail",
        "reason": "pytorch-addon must not change PyTorch ecosystem versions"
      }
    ]
  }
}
```

### Policy Structure Schema
```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "patternProperties": {
    "^.*$": {
      "type": "object",
      "properties": {
        "uninstall": {
          "type": "array",
          "description": "When condition satisfied (or always if no condition), remove package and terminate",
          "items": {
            "type": "object",
            "required": ["target"],
            "properties": {
              "condition": {
                "type": "object",
                "description": "Optional: always remove if absent",
                "required": ["type"],
                "properties": {
                  "type": {"enum": ["installed", "platform"]},
                  "package": {"type": "string", "description": "Optional: defaults to self"},
                  "spec": {"type": "string", "description": "Optional: version condition"},
                  "os": {"type": "string"},
                  "has_gpu": {"type": "boolean"},
                  "comfyui_version": {"type": "string"}
                }
              },
              "target": {
                "type": "string",
                "description": "Package name to remove"
              },
              "reason": {"type": "string"}
            }
          }
        },
        "restore": {
          "type": "array",
          "description": "When condition satisfied (or always if no condition), restore package and terminate",
          "items": {
            "type": "object",
            "required": ["target", "version"],
            "properties": {
              "condition": {
                "type": "object",
                "description": "Optional: always restore if absent",
                "required": ["type"],
                "properties": {
                  "type": {"enum": ["installed", "platform"]},
                  "package": {"type": "string", "description": "Optional: defaults to self"},
                  "spec": {"type": "string", "description": "Optional: version condition"},
                  "os": {"type": "string"},
                  "has_gpu": {"type": "boolean"},
                  "comfyui_version": {"type": "string"}
                }
              },
              "target": {
                "type": "string",
                "description": "Package name to restore"
              },
              "version": {
                "type": "string",
                "description": "Version to restore"
              },
              "extra_index_url": {"type": "string"},
              "reason": {"type": "string"}
            }
          }
        },
        "apply_first_match": {
          "type": "array",
          "description": "Execute only first condition-satisfying policy (exclusive)",
          "items": {
            "type": "object",
            "required": ["type"],
            "properties": {
              "condition": {
                "type": "object",
                "description": "Optional: always apply if absent",
                "required": ["type"],
                "properties": {
                  "type": {"enum": ["installed", "platform"]},
                  "package": {"type": "string", "description": "Optional: defaults to self"},
                  "spec": {"type": "string", "description": "Optional: version condition"},
                  "os": {"type": "string"},
                  "has_gpu": {"type": "boolean"},
                  "comfyui_version": {"type": "string"}
                }
              },
              "type": {
                "enum": ["skip", "force_version", "replace"],
                "description": "Exclusive action: determines installation method"
              },
              "version": {"type": "string"},
              "replacement": {"type": "string"},
              "extra_index_url": {"type": "string"},
              "reason": {"type": "string"}
            }
          }
        },
        "apply_all_matches": {
          "type": "array",
          "description": "Execute all condition-satisfying policies (cumulative)",
          "items": {
            "type": "object",
            "required": ["type"],
            "properties": {
              "condition": {
                "type": "object",
                "description": "Optional: always apply if absent",
                "required": ["type"],
                "properties": {
                  "type": {"enum": ["installed", "platform"]},
                  "package": {"type": "string", "description": "Optional: defaults to self"},
                  "spec": {"type": "string", "description": "Optional: version condition"},
                  "os": {"type": "string"},
                  "has_gpu": {"type": "boolean"},
                  "comfyui_version": {"type": "string"}
                }
              },
              "type": {
                "enum": ["pin_dependencies", "install_with", "warn"],
                "description": "Cumulative action: adds installation options"
              },
              "pinned_packages": {
                "type": "array",
                "items": {"type": "string"}
              },
              "on_failure": {"enum": ["fail", "retry_without_pin"]},
              "additional_packages": {"type": "array"},
              "message": {"type": "string"},
              "allow_continue": {"type": "boolean"},
              "reason": {"type": "string"}
            }
          }
        }
      }
    }
  }
}
```

## Error Handling

* Default behavior when errors occur during policy execution:
  - Log the error and continue
  - Treat it as an installation failure only when pin_dependencies' on_failure="fail"
  - In all other cases, leave a warning and attempt the originally requested installation


* pip_install: Performs pip package installation (a convenience wrapper is sketched below)
  - Use manager_util.make_pip_cmd to generate commands so that uv and pip can be applied selectively
  - Provide a way to skip policy application through the override_policy flag
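Given the PipBatch API above, pip_install can plausibly be a thin one-shot wrapper; the exact signature beyond override_policy is an assumption:

```python
def pip_install(package_info, extra_index_url=None, override_policy=False):
    """One-shot convenience wrapper around PipBatch for a single package (sketch)."""
    with PipBatch() as batch:
        return batch.install(package_info,
                             extra_index_url=extra_index_url,
                             override_policy=override_policy)
```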
614 comfyui_manager/common/pip_util.implementation-plan.en.md (Normal file)
@@ -0,0 +1,614 @@
# pip_util.py Implementation Plan Document

## 1. Project Overview

### Purpose
Implement a policy-based pip package management system that minimizes breakage of already-installed dependencies.

### Core Features
- JSON-based policy file loading and merging (lazy loading)
- Per-package installation policy evaluation and application
- Performance optimization through batch-level pip freeze caching
- Automated conditional package removal/restoration

### Technology Stack
- Python 3.x
- packaging library (version comparison)
- subprocess (pip command execution)
- json (policy file parsing)

---

## 2. Architecture Design

### 2.1 Global Policy Management (Lazy Loading Pattern)

```
┌─────────────────────────────────────┐
│ get_pip_policy()                    │
│  - Auto-loads policy files on       │
│    first call via lazy loading      │
│  - Returns cache on subsequent calls│
└─────────────────────────────────────┘
                  │
                  ▼
┌─────────────────────────────────────┐
│ _pip_policy_cache (global)          │
│  - Merged policy dictionary         │
│  - {package_name: policy_object}    │
└─────────────────────────────────────┘
```

### 2.2 Batch Operation Class (PipBatch)

```
┌─────────────────────────────────────┐
│ PipBatch (Context Manager)          │
│  ┌───────────────────────────────┐  │
│  │ _installed_cache              │  │
│  │  - Caches pip freeze results  │  │
│  │  - {package: version}         │  │
│  └───────────────────────────────┘  │
│                                     │
│  Public Methods:                    │
│  ├─ install()                       │
│  ├─ ensure_not_installed()          │
│  └─ ensure_installed()              │
│                                     │
│  Private Methods:                   │
│  ├─ _get_installed_packages()       │
│  ├─ _refresh_installed_cache()      │
│  ├─ _invalidate_cache()             │
│  ├─ _parse_package_spec()           │
│  └─ _evaluate_condition()           │
└─────────────────────────────────────┘
```

### 2.3 Policy Evaluation Flow

```
install("numpy>=1.20") called
        │
        ▼
get_pip_policy() → Load policy (lazy)
        │
        ▼
Parse package name: "numpy"
        │
        ▼
Look up "numpy" policy in policy dictionary
        │
        ├─ Evaluate apply_first_match (exclusive)
        │   ├─ skip → Return False (don't install)
        │   ├─ force_version → Change version
        │   └─ replace → Replace package
        │
        ├─ Evaluate apply_all_matches (cumulative)
        │   ├─ pin_dependencies → Pin dependencies
        │   ├─ install_with → Additional packages
        │   └─ warn → Warning log
        │
        ▼
Execute pip install
        │
        ▼
Invalidate cache (_invalidate_cache)
```

---

## 3. Phase-by-Phase Implementation Plan

### Phase 1: Core Infrastructure Setup (2-3 hours)

#### Task 1.1: Project Structure and Dependency Setup (30 min)
**Implementation**:
- Create the `pip_util.py` file
- Add the necessary import statements

```python
import json
import logging
import platform
import re
import subprocess
from pathlib import Path
from typing import Dict, List, Optional, Tuple

from packaging.specifiers import SpecifierSet
from packaging.version import Version

from . import manager_util, context
```

- Set up logging

```python
logger = logging.getLogger(__name__)
```

**Validation**:
- Module loads without import errors
- Logger works correctly

#### Task 1.2: Global Variable and get_pip_policy() Implementation (1 hour)
**Implementation**:
- Declare the global variable

```python
_pip_policy_cache: Optional[Dict] = None
```

- Implement the `get_pip_policy()` function
  - Check the cache and return early
  - Read the base policy file (`{manager_util.comfyui_manager_path}/pip-policy.json`)
  - Read the user policy file (`{context.manager_files_path}/pip-policy.user.json`)
    - Create the file if it doesn't exist (user policy only)
  - Merge the policies (complete package-level replacement)
  - Save to the cache and return

**Exception Handling**:
- `FileNotFoundError`: File not found → Use an empty dictionary
- `json.JSONDecodeError`: JSON parse failure → Warning log + empty dictionary
- General exception: Warning log + empty dictionary

**Validation**:
- Returns an empty dictionary when policy files don't exist
- Returns the correct merged result when policy files exist
- Confirms cache usage on the second call (the load log appears only once)

#### Task 1.3: PipBatch Class Basic Structure (30 min)
**Implementation**:
- Class definition and `__init__`

```python
class PipBatch:
    def __init__(self):
        self._installed_cache: Optional[Dict[str, str]] = None
```

- Context manager methods (`__enter__`, `__exit__`)

```python
def __enter__(self):
    return self

def __exit__(self, exc_type, exc_val, exc_tb):
    self._installed_cache = None
    return False
```

**Validation**:
- `with PipBatch() as batch:` syntax works correctly
- Cache cleared on `__exit__` call

---

### Phase 2: Caching and Utility Methods (2-3 hours)

#### Task 2.1: pip freeze Caching Methods (1 hour)
**Implementation**:
- Implement `_refresh_installed_cache()` (a sketch follows this task)
  - Call `manager_util.make_pip_cmd(["freeze"])`
  - Execute the command via subprocess
  - Parse the output (package==version format)
  - Exclude editable packages (-e) and comments (#)
  - Convert to a dictionary and store in `self._installed_cache`

- Implement `_get_installed_packages()`
  - Call `_refresh_installed_cache()` if the cache is None
  - Return the cache

- Implement `_invalidate_cache()`
  - Set `self._installed_cache = None`

**Exception Handling**:
- `subprocess.CalledProcessError`: pip freeze failure → Empty dictionary
- Parse error: Ignore the line + warning log

**Validation**:
- pip freeze results correctly parsed into a dictionary
- A fresh load occurs after cache invalidation and re-query
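A minimal sketch of `_refresh_installed_cache` as specified in this task; the parsing rules (skip `-e` and `#`, split on `==`) come from the task description, while the surrounding module context (logger, manager_util) is assumed from Task 1.1:

```python
# inside class PipBatch — assumes the imports from Task 1.1 (subprocess, logger, manager_util)
def _refresh_installed_cache(self):
    """Re-read `pip freeze` output into {package: version}; failure leaves an empty cache."""
    packages = {}
    try:
        output = subprocess.check_output(manager_util.make_pip_cmd(["freeze"]),
                                         universal_newlines=True)
        for line in output.splitlines():
            line = line.strip()
            # skip editable installs and comments, per the task description
            if not line or line.startswith("-e") or line.startswith("#"):
                continue
            if "==" in line:
                name, _, version = line.partition("==")
                packages[name] = version
    except subprocess.CalledProcessError:
        logger.warning("[pip_util] pip freeze failed; assuming no installed packages")
    self._installed_cache = packages
```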
#### Task 2.2: Package Spec Parsing (30 min)
**Implementation**:
- Implement `_parse_package_spec(package_info)`
  - Regex pattern: `^([a-zA-Z0-9_-]+)([><=!~]+.*)?$`
  - Split the package name and version spec
  - Return tuple: `(package_name, version_spec)`

**Exception Handling**:
- Parse failure: Raise `ValueError`

**Validation**:
- "numpy" → ("numpy", None)
- "numpy==1.26.0" → ("numpy", "==1.26.0")
- "pandas>=2.0.0" → ("pandas", ">=2.0.0")
- Invalid format → ValueError
#### Task 2.3: Condition Evaluation Method (1.5 hours)

**Implementation**:

- Implement `_evaluate_condition(condition, package_name, installed_packages)`

**Handling by Condition Type**:

1. **condition is None**: always return True
2. **"installed" type**:
   - `target_package = condition.get("package", package_name)`
   - Look up the installed version with `installed_packages.get(target_package)`
   - If a spec exists, compare using `packaging.specifiers.SpecifierSet`
   - If no spec, only check installation status
3. **"platform" type**:
   - `os` condition: compare with `platform.system()`
   - `has_gpu` condition: check `torch.cuda.is_available()` (False if torch is unavailable)
   - `comfyui_version` condition: TODO (currently logs a warning)

**Exception Handling**:

- Version comparison failure: warning log + return False
- Unknown condition type: warning log + return False

**Validation**:

- Write test cases for each condition type (example conditions follow this list)
- Verify edge-case handling (torch not installed, invalid version format, etc.)
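For reference, two example condition objects in the shape the evaluator expects; the key names (`type`, `package`, `spec`, `os`, `has_gpu`) come from the implementation later in this diff, while the concrete values are illustrative:

```json
[
  { "type": "installed", "package": "numpy", "spec": ">=1.20" },
  { "type": "platform", "os": "linux", "has_gpu": true }
]
```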
---
### Phase 3: Core Installation Logic Implementation (4-5 hours)

#### Task 3.1: install() Method - Basic Flow (2 hours)

**Implementation**:

1. Parse the package spec (`_parse_package_spec`)
2. Query the installed-package cache (`_get_installed_packages`)
3. If `override_policy=True`, install directly and return
4. Call `get_pip_policy()` to load the policy
5. Fall back to default installation if no policy exists

**Validation**:

- Verify the policy is ignored when `override_policy=True`
- Verify default installation for packages without a policy
#### Task 3.2: install() Method - apply_first_match Policy (1 hour)

**Implementation**:

- Iterate through the policy list top to bottom
- Evaluate each policy's condition (`_evaluate_condition`)
- When the condition is satisfied:
  - **skip**: log the reason and return False
  - **force_version**: force the version change
  - **replace**: replace the package
- Apply only the first match (break); see the example below

**Validation**:

- Verify installation is blocked by a skip policy
- Verify the version is changed by force_version
- Verify the package is replaced by replace
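A sketch of what an `apply_first_match` entry in `pip-policy.json` might look like; the field names follow the implementation later in this diff, but the package and version values are invented for illustration:

```json
{
  "onnxruntime-gpu": {
    "apply_first_match": [
      {
        "type": "skip",
        "condition": { "type": "platform", "has_gpu": false },
        "reason": "GPU build is not usable without CUDA"
      },
      {
        "type": "force_version",
        "version": "1.17.1",
        "reason": "Known-good version for this environment"
      }
    ]
  }
}
```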
#### Task 3.3: install() Method - apply_all_matches Policy (1 hour)

**Implementation**:

- Iterate through the policy list top to bottom
- Evaluate each policy's condition
- Apply every policy whose condition is satisfied (see the example below):
  - **pin_dependencies**: pin to the installed version
  - **install_with**: add to the additional-package list
  - **warn**: output a warning log

**Validation**:

- Verify multiple policies are applied simultaneously
- Verify version pinning by pin_dependencies
- Verify additional package installation by install_with
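A sketch of cumulative `apply_all_matches` entries (field names per the implementation below; the packages and messages are illustrative):

```json
{
  "opencv-python": {
    "apply_all_matches": [
      {
        "type": "pin_dependencies",
        "pinned_packages": ["numpy"],
        "on_failure": "retry_without_pin",
        "reason": "Avoid numpy being upgraded as a side effect"
      },
      {
        "type": "warn",
        "message": "Multiple opencv variants may conflict",
        "allow_continue": true
      }
    ]
  }
}
```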
#### Task 3.4: install() Method - Installation Execution and Retry Logic (1 hour)

**Implementation**:

1. Compose the final package list
2. Generate the command using `manager_util.make_pip_cmd()`
3. Handle `extra_index_url`
4. Execute the installation via subprocess
5. Handle failure based on the `on_failure` setting:
   - `retry_without_pin`: retry without pins
   - `fail`: raise an exception
   - Other: warning log
6. Invalidate the cache on success

**Validation**:

- Verify normal installation
- Verify the retry logic on pin failure
- Verify error handling
---
### Phase 4: Batch Operation Methods Implementation (2-3 hours)

#### Task 4.1: ensure_not_installed() Implementation (1.5 hours)

**Implementation**:

1. Call `get_pip_policy()`
2. Iterate through all package policies
3. Check each package's uninstall policy
4. When the condition is satisfied:
   - Check whether the target package is installed
   - If installed, execute `pip uninstall -y {target}`
   - Remove it from the cache
   - Add it to the removal list
5. Execute only the first match (per package)
6. Return the list of removed packages

**Exception Handling**:

- Individual package removal failure: warning log + continue

**Validation**:

- Verify package removal by an uninstall policy (see the example below)
- Verify batch removal of multiple packages
- Verify other packages are still processed even when a removal fails
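An illustrative uninstall policy (field names per the implementation below; the packages are made up):

```json
{
  "onnxruntime": {
    "uninstall": [
      {
        "target": "onnxruntime",
        "condition": { "type": "installed", "package": "onnxruntime-gpu" },
        "reason": "CPU and GPU builds must not coexist"
      }
    ]
  }
}
```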
#### Task 4.2: ensure_installed() Implementation (1.5 hours)

**Implementation**:

1. Call `get_pip_policy()`
2. Iterate through all package policies
3. Check each package's restore policy
4. When the condition is satisfied:
   - Check the target package's current version
   - If absent or at a different version:
     - Execute `pip install {target}=={version}`
     - Add `extra_index_url` if present
   - Update the cache
   - Add it to the restoration list
5. Execute only the first match (per package)
6. Return the list of restored packages

**Exception Handling**:

- Individual package installation failure: warning log + continue

**Validation**:

- Verify package restoration by a restore policy (see the example below)
- Verify reinstallation on version mismatch
- Verify other packages are still processed even when a restoration fails
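An illustrative restore policy (field names per the implementation below; the version and index URL are placeholders):

```json
{
  "torch": {
    "restore": [
      {
        "target": "torch",
        "version": "2.3.1",
        "extra_index_url": "https://download.pytorch.org/whl/cu121",
        "condition": { "type": "platform", "has_gpu": true },
        "reason": "Re-pin the CUDA build after batch operations"
      }
    ]
  }
}
```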
---
## 4. Testing Strategy

### 4.1 Unit Tests

#### Policy Loading Tests

```python
def test_get_pip_policy_empty():
    """Returns empty dictionary when policy files don't exist"""

def test_get_pip_policy_merge():
    """Correctly merges base and user policies"""

def test_get_pip_policy_cache():
    """Uses cache on second call"""
```

#### Package Parsing Tests

```python
def test_parse_package_spec_simple():
    """'numpy' → ('numpy', None)"""

def test_parse_package_spec_version():
    """'numpy==1.26.0' → ('numpy', '==1.26.0')"""

def test_parse_package_spec_range():
    """'pandas>=2.0.0' → ('pandas', '>=2.0.0')"""

def test_parse_package_spec_invalid():
    """Invalid format → ValueError"""
```
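As a sketch of how the parsing stubs above could be fleshed out (this assumes pytest and that `PipBatch` is importable from `comfyui_manager.common.pip_util`, per the file added in this diff):

```python
import pytest

from comfyui_manager.common.pip_util import PipBatch

@pytest.mark.parametrize("spec, expected", [
    ("numpy", ("numpy", None)),
    ("numpy==1.26.0", ("numpy", "==1.26.0")),
    ("pandas>=2.0.0", ("pandas", ">=2.0.0")),
])
def test_parse_package_spec(spec, expected):
    # _parse_package_spec returns (name, version_spec_or_None)
    assert PipBatch()._parse_package_spec(spec) == expected

def test_parse_package_spec_invalid():
    with pytest.raises(ValueError):
        PipBatch()._parse_package_spec("==not-a-spec")
```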
#### Condition Evaluation Tests

```python
def test_evaluate_condition_none():
    """None condition → True"""

def test_evaluate_condition_installed():
    """Evaluates installed package condition"""

def test_evaluate_condition_platform():
    """Evaluates platform condition"""
```

### 4.2 Integration Tests

#### Installation Policy Tests

```python
def test_install_with_skip_policy():
    """Blocks installation with skip policy"""

def test_install_with_force_version():
    """Changes version with force_version policy"""

def test_install_with_replace():
    """Replaces package with replace policy"""

def test_install_with_pin_dependencies():
    """Pins versions with pin_dependencies"""
```

#### Batch Operation Tests

```python
def test_ensure_not_installed():
    """Removes packages with uninstall policy"""

def test_ensure_installed():
    """Restores packages with restore policy"""

def test_batch_workflow():
    """Tests complete batch workflow"""
```

### 4.3 Edge Case Tests

```python
def test_install_without_policy():
    """Default installation for packages without policy"""

def test_install_override_policy():
    """Ignores policy with override_policy=True"""

def test_pip_freeze_failure():
    """Handles empty cache on pip freeze failure"""

def test_json_parse_error():
    """Handles malformed JSON files"""

def test_subprocess_failure():
    """Exception handling when pip command fails"""
```
---
## 5. Error Handling Strategy

### 5.1 Policy Loading Errors

- **File not found**: warning log + empty dictionary
- **JSON parse failure**: error log + empty dictionary
- **No read permission**: warning log + empty dictionary

### 5.2 Package Installation Errors

- **pip command failure**: depends on the `on_failure` setting
  - `retry_without_pin`: retry
  - `fail`: raise an exception
  - Other: warning log
- **Invalid package spec**: raise ValueError

### 5.3 Batch Operation Errors

- **Individual package failure**: warning log + continue to the next package
- **pip freeze failure**: empty dictionary + warning log
---
## 6. Performance Optimization

### 6.1 Caching Strategy

- **Policy cache**: reused program-wide via a global variable
- **pip freeze cache**: reused per batch, invalidated after install/remove
- **Lazy loading**: load only when needed

### 6.2 Parallel Processing Considerations

- The current implementation is not thread-safe
- Consider adding a `threading.Lock` if needed
- Batch operations execute sequentially only
---
## 7. Documentation Requirements

### 7.1 Code Documentation

- Docstrings required for all public methods
- Specify parameters, return values, and exceptions
- Include usage examples

### 7.2 User Guide

- Explain the `pip-policy.json` structure
- Policy writing examples
- Usage pattern examples

### 7.3 Developer Guide

- Architecture explanation
- How to extend the system
- How to run the tests
---
## 8. Deployment Checklist

### 8.1 Code Quality

- [ ] All unit tests pass
- [ ] All integration tests pass
- [ ] Code coverage ≥80%
- [ ] No linting errors (flake8, pylint)
- [ ] Type hints complete (mypy passes)

### 8.2 Documentation

- [ ] README.md written
- [ ] API documentation generated
- [ ] Example policy files written
- [ ] Usage guide written

### 8.3 Performance Verification

- [ ] Policy loading performance measured (<100 ms)
- [ ] pip freeze caching effectiveness verified (≥50% speed improvement)
- [ ] Memory usage confirmed (<10 MB)

### 8.4 Security Verification

- [ ] Input validation complete
- [ ] Path traversal prevention
- [ ] Command injection prevention
- [ ] JSON parsing safety confirmed
---
## 9. Future Improvements

### 9.1 Short-term (1-2 weeks)

- Implement the ComfyUI version check
- Implement the user confirmation prompt (`allow_continue=false`)
- Thread-safety improvements (add a Lock)

### 9.2 Mid-term (1-2 months)

- Add policy validation tools
- Policy migration tools
- More detailed logging and debugging options

### 9.3 Long-term (3-6 months)

- Web UI for policy management
- Provide policy templates
- Community policy sharing system
---
## 10. Risks and Mitigation Strategies

### Risk 1: Policy Conflicts

**Description**: Policies for different packages may conflict
**Mitigation**: Develop policy validation tools and a conflict detection algorithm

### Risk 2: pip Version Compatibility

**Description**: Must work across various pip versions
**Mitigation**: Test on multiple pip versions; branch on version where necessary

### Risk 3: Performance Degradation

**Description**: Installation speed may decrease due to policy evaluation
**Mitigation**: Optimize caching, minimize condition evaluation

### Risk 4: Policy Misconfiguration

**Description**: Users may write incorrect policies
**Mitigation**: JSON schema validation; provide examples and guides
---
## 11. Timeline

### Week 1

- Phase 1: Core Infrastructure Setup (Days 1-2)
- Phase 2: Caching and Utility Methods (Days 3-4)
- Write unit tests (Day 5)

### Week 2

- Phase 3: Core Installation Logic Implementation (Days 1-3)
- Phase 4: Batch Operation Methods Implementation (Days 4-5)

### Week 3

- Integration and edge case testing (Days 1-2)
- Documentation (Day 3)
- Code review and refactoring (Days 4-5)

### Week 4

- Performance optimization (Days 1-2)
- Security verification (Day 3)
- Final testing and deployment preparation (Days 4-5)
---
## 12. Success Criteria

### Feature Completeness

- ✅ All policy types (uninstall, apply_first_match, apply_all_matches, restore) work correctly
- ✅ Policy merge logic works correctly
- ✅ Batch operations perform normally

### Quality Metrics

- ✅ Test coverage ≥80%
- ✅ All tests pass
- ✅ 0 linting errors
- ✅ 100% type hint completion

### Performance Metrics

- ✅ Policy loading <100 ms
- ✅ ≥50% performance improvement with pip freeze caching
- ✅ Memory usage <10 MB

### Usability

- ✅ Clear error messages
- ✅ Sufficient documentation
- ✅ Verified in real-world use cases
629 comfyui_manager/common/pip_util.py Normal file
@@ -0,0 +1,629 @@
"""
|
||||
pip_util - Policy-based pip package management system
|
||||
|
||||
This module provides a policy-based approach to pip package installation
|
||||
to minimize dependency conflicts and protect existing installed packages.
|
||||
|
||||
Usage:
|
||||
# Batch operations (policy auto-loaded)
|
||||
with PipBatch() as batch:
|
||||
batch.ensure_not_installed()
|
||||
batch.install("numpy>=1.20")
|
||||
batch.install("pandas>=2.0")
|
||||
batch.install("scipy>=1.7")
|
||||
batch.ensure_installed()
|
||||
"""
|
||||
|
||||
import json
|
||||
import logging
|
||||
import platform
|
||||
import re
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
from typing import Dict, List, Optional, Tuple
|
||||
|
||||
from packaging.requirements import Requirement
|
||||
from packaging.specifiers import SpecifierSet
|
||||
from packaging.version import Version
|
||||
|
||||
from . import manager_util, context
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Global policy cache (lazy loaded on first access)
|
||||
_pip_policy_cache: Optional[Dict] = None
|
||||
|
||||
|
||||
def get_pip_policy() -> Dict:
|
||||
"""
|
||||
Get pip policy with lazy loading.
|
||||
|
||||
Returns the cached policy if available, otherwise loads it from files.
|
||||
This function automatically loads the policy on first access.
|
||||
|
||||
Thread safety: This function is NOT thread-safe.
|
||||
Ensure single-threaded access during initialization.
|
||||
|
||||
Returns:
|
||||
Dictionary of merged pip policies
|
||||
|
||||
Example:
|
||||
>>> policy = get_pip_policy()
|
||||
>>> numpy_policy = policy.get("numpy", {})
|
||||
"""
|
||||
global _pip_policy_cache
|
||||
|
||||
# Return cached policy if already loaded
|
||||
if _pip_policy_cache is not None:
|
||||
logger.debug("Returning cached pip policy")
|
||||
return _pip_policy_cache
|
||||
|
||||
logger.info("Loading pip policies...")
|
||||
|
||||
# Load base policy
|
||||
base_policy = {}
|
||||
base_policy_path = Path(manager_util.comfyui_manager_path) / "pip-policy.json"
|
||||
|
||||
try:
|
||||
if base_policy_path.exists():
|
||||
with open(base_policy_path, 'r', encoding='utf-8') as f:
|
||||
base_policy = json.load(f)
|
||||
logger.debug(f"Loaded base policy from {base_policy_path}")
|
||||
else:
|
||||
logger.warning(f"Base policy file not found: {base_policy_path}")
|
||||
except json.JSONDecodeError as e:
|
||||
logger.error(f"Failed to parse base policy JSON: {e}")
|
||||
base_policy = {}
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to read base policy file: {e}")
|
||||
base_policy = {}
|
||||
|
||||
# Load user policy
|
||||
user_policy = {}
|
||||
user_policy_path = Path(context.manager_files_path) / "pip-policy.user.json"
|
||||
|
||||
try:
|
||||
if user_policy_path.exists():
|
||||
with open(user_policy_path, 'r', encoding='utf-8') as f:
|
||||
user_policy = json.load(f)
|
||||
logger.debug(f"Loaded user policy from {user_policy_path}")
|
||||
else:
|
||||
# Create empty user policy file
|
||||
user_policy_path.parent.mkdir(parents=True, exist_ok=True)
|
||||
with open(user_policy_path, 'w', encoding='utf-8') as f:
|
||||
json.dump({"_comment": "User-specific pip policy overrides"}, f, indent=2)
|
||||
logger.info(f"Created empty user policy file: {user_policy_path}")
|
||||
except json.JSONDecodeError as e:
|
||||
logger.warning(f"Failed to parse user policy JSON: {e}")
|
||||
user_policy = {}
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to read user policy file: {e}")
|
||||
user_policy = {}
|
||||
|
||||
# Merge policies (package-level override: user completely replaces base per package)
|
||||
merged_policy = base_policy.copy()
|
||||
for package_name, package_policy in user_policy.items():
|
||||
if package_name.startswith("_"): # Skip metadata fields like _comment
|
||||
continue
|
||||
merged_policy[package_name] = package_policy # Complete package replacement
|
||||
|
||||
# Store in global cache
|
||||
_pip_policy_cache = merged_policy
|
||||
logger.info(f"Policy loaded successfully: {len(_pip_policy_cache)} package policies")
|
||||
|
||||
return _pip_policy_cache
|
||||
|
||||
|
||||
class PipBatch:
|
||||
"""
|
||||
Pip package installation batch manager.
|
||||
|
||||
Maintains pip freeze cache during a batch of operations for performance optimization.
|
||||
|
||||
Usage pattern:
|
||||
# Batch operations (policy auto-loaded)
|
||||
with PipBatch() as batch:
|
||||
batch.ensure_not_installed()
|
||||
batch.install("numpy>=1.20")
|
||||
batch.install("pandas>=2.0")
|
||||
batch.install("scipy>=1.7")
|
||||
batch.ensure_installed()
|
||||
|
||||
Attributes:
|
||||
_installed_cache: Cache of installed packages from pip freeze
|
||||
"""
|
||||
|
||||
def __init__(self):
|
||||
"""Initialize PipBatch with empty cache."""
|
||||
self._installed_cache: Optional[Dict[str, str]] = None
|
||||
|
||||
def __enter__(self):
|
||||
"""Enter context manager."""
|
||||
return self
|
||||
|
||||
def __exit__(self, exc_type, exc_val, exc_tb):
|
||||
"""Exit context manager and clear cache."""
|
||||
self._installed_cache = None
|
||||
return False
|
||||
|
||||
def _refresh_installed_cache(self) -> None:
|
||||
"""
|
||||
Refresh the installed packages cache by executing pip freeze.
|
||||
|
||||
Parses pip freeze output into a dictionary of {package_name: version}.
|
||||
Ignores editable packages and comments.
|
||||
|
||||
Raises:
|
||||
No exceptions raised - failures result in empty cache with warning log
|
||||
"""
|
||||
try:
|
||||
cmd = manager_util.make_pip_cmd(["freeze"])
|
||||
result = subprocess.run(cmd, capture_output=True, text=True, check=True)
|
||||
|
||||
packages = {}
|
||||
for line in result.stdout.strip().split('\n'):
|
||||
line = line.strip()
|
||||
|
||||
# Skip empty lines
|
||||
if not line:
|
||||
continue
|
||||
|
||||
# Skip editable packages (-e /path/to/package or -e git+https://...)
|
||||
# Editable packages don't have version info and are typically development-only
|
||||
if line.startswith('-e '):
|
||||
continue
|
||||
|
||||
# Skip comments (defensive: pip freeze typically doesn't output comments,
|
||||
# but this handles manually edited requirements.txt or future pip changes)
|
||||
if line.startswith('#'):
|
||||
continue
|
||||
|
||||
# Parse package==version
|
||||
if '==' in line:
|
||||
try:
|
||||
package_name, version = line.split('==', 1)
|
||||
packages[package_name.strip()] = version.strip()
|
||||
except ValueError:
|
||||
logger.warning(f"Failed to parse pip freeze line: {line}")
|
||||
continue
|
||||
|
||||
self._installed_cache = packages
|
||||
logger.debug(f"Refreshed installed packages cache: {len(packages)} packages")
|
||||
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.warning(f"pip freeze failed: {e}")
|
||||
self._installed_cache = {}
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to refresh installed packages cache: {e}")
|
||||
self._installed_cache = {}
|
||||
|
||||
def _get_installed_packages(self) -> Dict[str, str]:
|
||||
"""
|
||||
Get cached installed packages, refresh if cache is None.
|
||||
|
||||
Returns:
|
||||
Dictionary of {package_name: version}
|
||||
"""
|
||||
if self._installed_cache is None:
|
||||
self._refresh_installed_cache()
|
||||
return self._installed_cache
|
||||
|
||||
def _invalidate_cache(self) -> None:
|
||||
"""
|
||||
Invalidate the installed packages cache.
|
||||
|
||||
Should be called after install/uninstall operations.
|
||||
"""
|
||||
self._installed_cache = None
|
||||
|
||||
def _parse_package_spec(self, package_info: str) -> Tuple[str, Optional[str]]:
|
||||
"""
|
||||
Parse package spec string into package name and version spec using PEP 508.
|
||||
|
||||
Uses the packaging library to properly parse package specifications according to
|
||||
PEP 508 standard, which handles complex cases like extras and multiple version
|
||||
constraints that simple regex cannot handle correctly.
|
||||
|
||||
Args:
|
||||
package_info: Package specification like "numpy", "numpy==1.26.0", "numpy>=1.20.0",
|
||||
or complex specs like "package[extra]>=1.0,<2.0"
|
||||
|
||||
Returns:
|
||||
Tuple of (package_name, version_spec)
|
||||
Examples: ("numpy", "==1.26.0"), ("pandas", ">=2.0.0"), ("scipy", None)
|
||||
Package names are normalized (e.g., "NumPy" -> "numpy")
|
||||
|
||||
Raises:
|
||||
ValueError: If package_info cannot be parsed according to PEP 508
|
||||
|
||||
Example:
|
||||
>>> batch._parse_package_spec("numpy>=1.20")
|
||||
("numpy", ">=1.20")
|
||||
>>> batch._parse_package_spec("requests[security]>=2.0,<3.0")
|
||||
("requests", ">=2.0,<3.0")
|
||||
"""
|
||||
try:
|
||||
req = Requirement(package_info)
|
||||
package_name = req.name # Normalized package name
|
||||
version_spec = str(req.specifier) if req.specifier else None
|
||||
return package_name, version_spec
|
||||
except Exception as e:
|
||||
raise ValueError(f"Invalid package spec: {package_info}") from e
|
||||
|
||||
def _evaluate_condition(self, condition: Optional[Dict], package_name: str,
|
||||
installed_packages: Dict[str, str]) -> bool:
|
||||
"""
|
||||
Evaluate policy condition and return whether it's satisfied.
|
||||
|
||||
Args:
|
||||
condition: Policy condition object (dict) or None
|
||||
package_name: Current package being processed
|
||||
installed_packages: Dictionary of {package_name: version}
|
||||
|
||||
Returns:
|
||||
True if condition is satisfied, False otherwise
|
||||
None condition always returns True
|
||||
|
||||
Example:
|
||||
>>> condition = {"type": "installed", "package": "numpy", "spec": ">=1.20"}
|
||||
>>> batch._evaluate_condition(condition, "numba", {"numpy": "1.26.0"})
|
||||
True
|
||||
"""
|
||||
# No condition means always satisfied
|
||||
if condition is None:
|
||||
return True
|
||||
|
||||
condition_type = condition.get("type")
|
||||
|
||||
if condition_type == "installed":
|
||||
# Check if a package is installed with optional version spec
|
||||
target_package = condition.get("package", package_name)
|
||||
installed_version = installed_packages.get(target_package)
|
||||
|
||||
# Package not installed
|
||||
if installed_version is None:
|
||||
return False
|
||||
|
||||
# Check version spec if provided
|
||||
spec = condition.get("spec")
|
||||
if spec:
|
||||
try:
|
||||
specifier = SpecifierSet(spec)
|
||||
return Version(installed_version) in specifier
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to compare version {installed_version} with spec {spec}: {e}")
|
||||
return False
|
||||
|
||||
# Package is installed (no spec check)
|
||||
return True
|
||||
|
||||
elif condition_type == "platform":
|
||||
# Check platform conditions (os, has_gpu, comfyui_version)
|
||||
conditions_met = True
|
||||
|
||||
# Check OS
|
||||
if "os" in condition:
|
||||
expected_os = condition["os"].lower()
|
||||
actual_os = platform.system().lower()
|
||||
if expected_os not in actual_os and actual_os not in expected_os:
|
||||
conditions_met = False
|
||||
|
||||
# Check GPU availability
|
||||
if "has_gpu" in condition:
|
||||
expected_gpu = condition["has_gpu"]
|
||||
try:
|
||||
import torch
|
||||
has_gpu = torch.cuda.is_available()
|
||||
except ImportError:
|
||||
has_gpu = False
|
||||
|
||||
if expected_gpu != has_gpu:
|
||||
conditions_met = False
|
||||
|
||||
# Check ComfyUI version
|
||||
if "comfyui_version" in condition:
|
||||
# TODO: Implement ComfyUI version check
|
||||
logger.warning("ComfyUI version condition not yet implemented")
|
||||
|
||||
return conditions_met
|
||||
|
||||
else:
|
||||
logger.warning(f"Unknown condition type: {condition_type}")
|
||||
return False
|
||||
|
||||
def install(self, package_info: str, extra_index_url: Optional[str] = None,
|
||||
override_policy: bool = False) -> bool:
|
||||
"""
|
||||
Install a pip package with policy-based modifications.
|
||||
|
||||
Args:
|
||||
package_info: Package specification (e.g., "numpy", "numpy==1.26.0", "numpy>=1.20.0")
|
||||
extra_index_url: Additional package repository URL (optional)
|
||||
override_policy: If True, skip policy application and install directly (default: False)
|
||||
|
||||
Returns:
|
||||
True if installation succeeded, False if skipped by policy
|
||||
|
||||
Raises:
|
||||
ValueError: If package_info cannot be parsed
|
||||
subprocess.CalledProcessError: If installation fails (depending on policy on_failure settings)
|
||||
|
||||
Example:
|
||||
>>> with PipBatch() as batch:
|
||||
... batch.install("numpy>=1.20")
|
||||
... batch.install("torch", override_policy=True)
|
||||
"""
|
||||
# Parse package spec
|
||||
try:
|
||||
package_name, version_spec = self._parse_package_spec(package_info)
|
||||
except ValueError as e:
|
||||
logger.error(f"Invalid package spec: {e}")
|
||||
raise
|
||||
|
||||
# Get installed packages cache
|
||||
installed_packages = self._get_installed_packages()
|
||||
|
||||
# Override policy - skip to direct installation
|
||||
if override_policy:
|
||||
logger.info(f"Installing {package_info} (policy override)")
|
||||
cmd = manager_util.make_pip_cmd(["install", package_info])
|
||||
if extra_index_url:
|
||||
cmd.extend(["--extra-index-url", extra_index_url])
|
||||
|
||||
try:
|
||||
subprocess.run(cmd, check=True)
|
||||
self._invalidate_cache()
|
||||
logger.info(f"Successfully installed {package_info}")
|
||||
return True
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.error(f"Failed to install {package_info}: {e}")
|
||||
raise
|
||||
|
||||
# Get policy (lazy loading)
|
||||
pip_policy = get_pip_policy()
|
||||
policy = pip_policy.get(package_name, {})
|
||||
|
||||
# If no policy, proceed with default installation
|
||||
if not policy:
|
||||
logger.debug(f"No policy found for {package_name}, proceeding with default installation")
|
||||
cmd = manager_util.make_pip_cmd(["install", package_info])
|
||||
if extra_index_url:
|
||||
cmd.extend(["--extra-index-url", extra_index_url])
|
||||
|
||||
try:
|
||||
subprocess.run(cmd, check=True)
|
||||
self._invalidate_cache()
|
||||
logger.info(f"Successfully installed {package_info}")
|
||||
return True
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.error(f"Failed to install {package_info}: {e}")
|
||||
raise
|
||||
|
||||
# Apply apply_first_match policies (exclusive - first match only)
|
||||
final_package_info = package_info
|
||||
final_extra_index_url = extra_index_url
|
||||
policy_reason = None
|
||||
|
||||
apply_first_match = policy.get("apply_first_match", [])
|
||||
for policy_item in apply_first_match:
|
||||
condition = policy_item.get("condition")
|
||||
if self._evaluate_condition(condition, package_name, installed_packages):
|
||||
policy_type = policy_item.get("type")
|
||||
|
||||
if policy_type == "skip":
|
||||
reason = policy_item.get("reason", "No reason provided")
|
||||
logger.info(f"Skipping installation of {package_name}: {reason}")
|
||||
return False
|
||||
|
||||
elif policy_type == "force_version":
|
||||
forced_version = policy_item.get("version")
|
||||
final_package_info = f"{package_name}=={forced_version}"
|
||||
policy_reason = policy_item.get("reason")
|
||||
if "extra_index_url" in policy_item:
|
||||
final_extra_index_url = policy_item["extra_index_url"]
|
||||
logger.info(f"Force version for {package_name}: {forced_version} ({policy_reason})")
|
||||
break # First match only
|
||||
|
||||
elif policy_type == "replace":
|
||||
replacement = policy_item.get("replacement")
|
||||
replacement_version = policy_item.get("version", "")
|
||||
if replacement_version:
|
||||
final_package_info = f"{replacement}{replacement_version}"
|
||||
else:
|
||||
final_package_info = replacement
|
||||
policy_reason = policy_item.get("reason")
|
||||
if "extra_index_url" in policy_item:
|
||||
final_extra_index_url = policy_item["extra_index_url"]
|
||||
logger.info(f"Replacing {package_name} with {final_package_info}: {policy_reason}")
|
||||
break # First match only
|
||||
|
||||
# Apply apply_all_matches policies (cumulative - all matches)
|
||||
additional_packages = []
|
||||
pinned_packages = []
|
||||
pin_on_failure = "fail"
|
||||
|
||||
apply_all_matches = policy.get("apply_all_matches", [])
|
||||
for policy_item in apply_all_matches:
|
||||
condition = policy_item.get("condition")
|
||||
if self._evaluate_condition(condition, package_name, installed_packages):
|
||||
policy_type = policy_item.get("type")
|
||||
|
||||
if policy_type == "pin_dependencies":
|
||||
pin_list = policy_item.get("pinned_packages", [])
|
||||
for pkg in pin_list:
|
||||
installed_version = installed_packages.get(pkg)
|
||||
if installed_version:
|
||||
pinned_packages.append(f"{pkg}=={installed_version}")
|
||||
else:
|
||||
logger.warning(f"Cannot pin {pkg}: not currently installed")
|
||||
pin_on_failure = policy_item.get("on_failure", "fail")
|
||||
reason = policy_item.get("reason", "")
|
||||
logger.info(f"Pinning dependencies: {pinned_packages} ({reason})")
|
||||
|
||||
elif policy_type == "install_with":
|
||||
additional = policy_item.get("additional_packages", [])
|
||||
additional_packages.extend(additional)
|
||||
reason = policy_item.get("reason", "")
|
||||
logger.info(f"Installing additional packages: {additional} ({reason})")
|
||||
|
||||
elif policy_type == "warn":
|
||||
message = policy_item.get("message", "")
|
||||
allow_continue = policy_item.get("allow_continue", True)
|
||||
logger.warning(f"Policy warning for {package_name}: {message}")
|
||||
if not allow_continue:
|
||||
# TODO: Implement user confirmation
|
||||
logger.info("User confirmation required (not implemented, continuing)")
|
||||
|
||||
# Build final package list
|
||||
packages_to_install = [final_package_info] + pinned_packages + additional_packages
|
||||
|
||||
# Execute installation
|
||||
cmd = manager_util.make_pip_cmd(["install"] + packages_to_install)
|
||||
if final_extra_index_url:
|
||||
cmd.extend(["--extra-index-url", final_extra_index_url])
|
||||
|
||||
try:
|
||||
subprocess.run(cmd, check=True)
|
||||
self._invalidate_cache()
|
||||
if policy_reason:
|
||||
logger.info(f"Successfully installed {final_package_info}: {policy_reason}")
|
||||
else:
|
||||
logger.info(f"Successfully installed {final_package_info}")
|
||||
return True
|
||||
|
||||
except subprocess.CalledProcessError as e:
|
||||
# Handle installation failure
|
||||
if pinned_packages and pin_on_failure == "retry_without_pin":
|
||||
logger.warning(f"Installation failed with pinned dependencies, retrying without pins")
|
||||
retry_cmd = manager_util.make_pip_cmd(["install", final_package_info])
|
||||
if final_extra_index_url:
|
||||
retry_cmd.extend(["--extra-index-url", final_extra_index_url])
|
||||
|
||||
try:
|
||||
subprocess.run(retry_cmd, check=True)
|
||||
self._invalidate_cache()
|
||||
logger.info(f"Successfully installed {final_package_info} (without pins)")
|
||||
return True
|
||||
except subprocess.CalledProcessError as retry_error:
|
||||
logger.error(f"Retry installation also failed: {retry_error}")
|
||||
raise
|
||||
|
||||
elif pin_on_failure == "fail":
|
||||
logger.error(f"Installation failed: {e}")
|
||||
raise
|
||||
|
||||
else:
|
||||
logger.warning(f"Installation failed, but continuing: {e}")
|
||||
return False
|
||||
|
||||
def ensure_not_installed(self) -> List[str]:
|
||||
"""
|
||||
Remove all packages matching uninstall policies (batch processing).
|
||||
|
||||
Iterates through all package policies and executes uninstall actions
|
||||
where conditions are satisfied.
|
||||
|
||||
Returns:
|
||||
List of removed package names
|
||||
|
||||
Example:
|
||||
>>> with PipBatch() as batch:
|
||||
... removed = batch.ensure_not_installed()
|
||||
... print(f"Removed: {removed}")
|
||||
"""
|
||||
# Get policy (lazy loading)
|
||||
pip_policy = get_pip_policy()
|
||||
|
||||
installed_packages = self._get_installed_packages()
|
||||
removed_packages = []
|
||||
|
||||
for package_name, policy in pip_policy.items():
|
||||
uninstall_policies = policy.get("uninstall", [])
|
||||
|
||||
for uninstall_policy in uninstall_policies:
|
||||
condition = uninstall_policy.get("condition")
|
||||
|
||||
if self._evaluate_condition(condition, package_name, installed_packages):
|
||||
target = uninstall_policy.get("target")
|
||||
reason = uninstall_policy.get("reason", "No reason provided")
|
||||
|
||||
# Check if target is installed
|
||||
if target in installed_packages:
|
||||
try:
|
||||
cmd = manager_util.make_pip_cmd(["uninstall", "-y", target])
|
||||
subprocess.run(cmd, check=True)
|
||||
|
||||
logger.info(f"Uninstalled {target}: {reason}")
|
||||
removed_packages.append(target)
|
||||
|
||||
# Remove from cache
|
||||
del installed_packages[target]
|
||||
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.warning(f"Failed to uninstall {target}: {e}")
|
||||
|
||||
# First match only per package
|
||||
break
|
||||
|
||||
return removed_packages
|
||||
|
||||
def ensure_installed(self) -> List[str]:
|
||||
"""
|
||||
Restore all packages matching restore policies (batch processing).
|
||||
|
||||
Iterates through all package policies and executes restore actions
|
||||
where conditions are satisfied.
|
||||
|
||||
Returns:
|
||||
List of restored package names
|
||||
|
||||
Example:
|
||||
>>> with PipBatch() as batch:
|
||||
... batch.install("numpy>=1.20")
|
||||
... restored = batch.ensure_installed()
|
||||
... print(f"Restored: {restored}")
|
||||
"""
|
||||
# Get policy (lazy loading)
|
||||
pip_policy = get_pip_policy()
|
||||
|
||||
installed_packages = self._get_installed_packages()
|
||||
restored_packages = []
|
||||
|
||||
for package_name, policy in pip_policy.items():
|
||||
restore_policies = policy.get("restore", [])
|
||||
|
||||
for restore_policy in restore_policies:
|
||||
condition = restore_policy.get("condition")
|
||||
|
||||
if self._evaluate_condition(condition, package_name, installed_packages):
|
||||
target = restore_policy.get("target")
|
||||
version = restore_policy.get("version")
|
||||
reason = restore_policy.get("reason", "No reason provided")
|
||||
extra_index_url = restore_policy.get("extra_index_url")
|
||||
|
||||
# Check if target needs restoration
|
||||
current_version = installed_packages.get(target)
|
||||
|
||||
if current_version is None or current_version != version:
|
||||
try:
|
||||
package_spec = f"{target}=={version}"
|
||||
cmd = manager_util.make_pip_cmd(["install", package_spec])
|
||||
|
||||
if extra_index_url:
|
||||
cmd.extend(["--extra-index-url", extra_index_url])
|
||||
|
||||
subprocess.run(cmd, check=True)
|
||||
|
||||
logger.info(f"Restored {package_spec}: {reason}")
|
||||
restored_packages.append(target)
|
||||
|
||||
# Update cache
|
||||
installed_packages[target] = version
|
||||
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.warning(f"Failed to restore {target}: {e}")
|
||||
|
||||
# First match only per package
|
||||
break
|
||||
|
||||
return restored_packages
|
||||
2916 comfyui_manager/common/pip_util.test-design.md Normal file
File diff suppressed because it is too large
@@ -2,6 +2,8 @@ import sys
 import subprocess
 import os

+from . import manager_util
+

 def security_check():
     print("[START] Security scan")
@@ -66,18 +68,23 @@ https://blog.comfy.org/comfyui-statement-on-the-ultralytics-crypto-miner-situati
         "lolMiner": [os.path.join(comfyui_path, 'lolMiner')]
     }

-    installed_pips = subprocess.check_output([sys.executable, '-m', "pip", "freeze"], text=True)
+    installed_pips = subprocess.check_output(manager_util.make_pip_cmd(["freeze"]), text=True)

     detected = set()
     try:
-        anthropic_info = subprocess.check_output([sys.executable, '-m', "pip", "show", "anthropic"], text=True, stderr=subprocess.DEVNULL)
-        anthropic_reqs = [x for x in anthropic_info.split('\n') if x.startswith("Requires")][0].split(': ')[1]
-        if "pycrypto" in anthropic_reqs:
-            location = [x for x in anthropic_info.split('\n') if x.startswith("Location")][0].split(': ')[1]
-            for fi in os.listdir(location):
-                if fi.startswith("anthropic"):
-                    guide["ComfyUI_LLMVISION"] = f"\n0.Remove {os.path.join(location, fi)}" + guide["ComfyUI_LLMVISION"]
-                    detected.add("ComfyUI_LLMVISION")
+        anthropic_info = subprocess.check_output(manager_util.make_pip_cmd(["show", "anthropic"]), text=True, stderr=subprocess.DEVNULL)
+        requires_lines = [x for x in anthropic_info.split('\n') if x.startswith("Requires")]
+        if requires_lines:
+            anthropic_reqs = requires_lines[0].split(": ", 1)[1]
+            if "pycrypto" in anthropic_reqs:
+                location_lines = [x for x in anthropic_info.split('\n') if x.startswith("Location")]
+                if location_lines:
+                    location = location_lines[0].split(": ", 1)[1]
+                    for fi in os.listdir(location):
+                        if fi.startswith("anthropic"):
+                            guide["ComfyUI_LLMVISION"] = (f"\n0.Remove {os.path.join(location, fi)}" + guide["ComfyUI_LLMVISION"])
+                            detected.add("ComfyUI_LLMVISION")

     except subprocess.CalledProcessError:
         pass
10157 custom-node-list.json → comfyui_manager/custom-node-list.json Executable file → Normal file
File diff suppressed because it is too large
68 comfyui_manager/data_models/README.md Normal file
@@ -0,0 +1,68 @@
# Data Models

This directory contains Pydantic models for ComfyUI Manager, providing type safety, validation, and serialization for the API and internal data structures.

## Overview

- `generated_models.py` - All models auto-generated from OpenAPI spec
- `__init__.py` - Package exports for all models

**Note**: All models are now auto-generated from the OpenAPI specification. Manual model files (`task_queue.py`, `state_management.py`) have been deprecated in favor of a single source of truth.

## Generating Types from OpenAPI

The state management models are automatically generated from the OpenAPI specification using `datamodel-codegen`. This ensures type safety and consistency between the API specification and the Python code.

### Prerequisites

Install the code generator:

```bash
pipx install datamodel-code-generator
```

### Generation Command

To regenerate all models after updating the OpenAPI spec:

```bash
datamodel-codegen \
  --use-subclass-enum \
  --field-constraints \
  --strict-types bytes \
  --use-double-quotes \
  --input openapi.yaml \
  --output comfyui_manager/data_models/generated_models.py \
  --output-model-type pydantic_v2.BaseModel
```

### When to Regenerate

You should regenerate the models when:

1. **Adding new API endpoints** that return new data structures
2. **Modifying existing schemas** in the OpenAPI specification
3. **Adding new state management features** that require new models

### Important Notes

- **Single source of truth**: All models are now generated from `openapi.yaml`
- **No manual models**: All previously manual models have been migrated to the OpenAPI spec
- **OpenAPI requirements**: New schemas must be referenced in API paths to be generated by datamodel-codegen
- **Validation**: Always validate the OpenAPI spec before generation:
  ```bash
  python3 -c "import yaml; yaml.safe_load(open('openapi.yaml'))"
  ```

### Example: Adding New State Models

1. Add your schema to `openapi.yaml` under `components/schemas/`
2. Reference the schema in an API endpoint response
3. Run the generation command above
4. Update `__init__.py` to export the new models
5. Import and use the models in your code

### Troubleshooting

- **Models not generated**: Ensure schemas are under `components/schemas/` (not `parameters/`)
- **Missing models**: Verify schemas are referenced in at least one API path
- **Import errors**: Check that new models are added to `__init__.py` exports
137 comfyui_manager/data_models/__init__.py Normal file
@@ -0,0 +1,137 @@
"""
|
||||
Data models for ComfyUI Manager.
|
||||
|
||||
This package contains Pydantic models used throughout the ComfyUI Manager
|
||||
for data validation, serialization, and type safety.
|
||||
|
||||
All models are auto-generated from the OpenAPI specification to ensure
|
||||
consistency between the API and implementation.
|
||||
"""
|
||||
|
||||
from .generated_models import (
|
||||
# Core Task Queue Models
|
||||
QueueTaskItem,
|
||||
TaskHistoryItem,
|
||||
TaskStateMessage,
|
||||
TaskExecutionStatus,
|
||||
|
||||
# WebSocket Message Models
|
||||
MessageTaskDone,
|
||||
MessageTaskStarted,
|
||||
MessageTaskFailed,
|
||||
MessageUpdate,
|
||||
ManagerMessageName,
|
||||
|
||||
# State Management Models
|
||||
BatchExecutionRecord,
|
||||
ComfyUISystemState,
|
||||
BatchOperation,
|
||||
InstalledNodeInfo,
|
||||
InstalledModelInfo,
|
||||
ComfyUIVersionInfo,
|
||||
|
||||
# Import Fail Info Models
|
||||
ImportFailInfoBulkRequest,
|
||||
ImportFailInfoBulkResponse,
|
||||
ImportFailInfoItem,
|
||||
ImportFailInfoItem1,
|
||||
|
||||
# Other models
|
||||
OperationType,
|
||||
OperationResult,
|
||||
ManagerPackInfo,
|
||||
ManagerPackInstalled,
|
||||
SelectedVersion,
|
||||
ManagerChannel,
|
||||
ManagerDatabaseSource,
|
||||
ManagerPackState,
|
||||
ManagerPackInstallType,
|
||||
ManagerPack,
|
||||
InstallPackParams,
|
||||
UpdatePackParams,
|
||||
UpdateAllPacksParams,
|
||||
UpdateComfyUIParams,
|
||||
FixPackParams,
|
||||
UninstallPackParams,
|
||||
DisablePackParams,
|
||||
EnablePackParams,
|
||||
UpdateAllQueryParams,
|
||||
UpdateComfyUIQueryParams,
|
||||
ComfyUISwitchVersionQueryParams,
|
||||
QueueStatus,
|
||||
ManagerMappings,
|
||||
ModelMetadata,
|
||||
NodePackageMetadata,
|
||||
SnapshotItem,
|
||||
Error,
|
||||
InstalledPacksResponse,
|
||||
HistoryResponse,
|
||||
HistoryListResponse,
|
||||
InstallType,
|
||||
SecurityLevel,
|
||||
RiskLevel,
|
||||
)
|
||||
|
||||
__all__ = [
|
||||
# Core Task Queue Models
|
||||
"QueueTaskItem",
|
||||
"TaskHistoryItem",
|
||||
"TaskStateMessage",
|
||||
"TaskExecutionStatus",
|
||||
|
||||
# WebSocket Message Models
|
||||
"MessageTaskDone",
|
||||
"MessageTaskStarted",
|
||||
"MessageTaskFailed",
|
||||
"MessageUpdate",
|
||||
"ManagerMessageName",
|
||||
|
||||
# State Management Models
|
||||
"BatchExecutionRecord",
|
||||
"ComfyUISystemState",
|
||||
"BatchOperation",
|
||||
"InstalledNodeInfo",
|
||||
"InstalledModelInfo",
|
||||
"ComfyUIVersionInfo",
|
||||
|
||||
# Import Fail Info Models
|
||||
"ImportFailInfoBulkRequest",
|
||||
"ImportFailInfoBulkResponse",
|
||||
"ImportFailInfoItem",
|
||||
"ImportFailInfoItem1",
|
||||
|
||||
# Other models
|
||||
"OperationType",
|
||||
"OperationResult",
|
||||
"ManagerPackInfo",
|
||||
"ManagerPackInstalled",
|
||||
"SelectedVersion",
|
||||
"ManagerChannel",
|
||||
"ManagerDatabaseSource",
|
||||
"ManagerPackState",
|
||||
"ManagerPackInstallType",
|
||||
"ManagerPack",
|
||||
"InstallPackParams",
|
||||
"UpdatePackParams",
|
||||
"UpdateAllPacksParams",
|
||||
"UpdateComfyUIParams",
|
||||
"FixPackParams",
|
||||
"UninstallPackParams",
|
||||
"DisablePackParams",
|
||||
"EnablePackParams",
|
||||
"UpdateAllQueryParams",
|
||||
"UpdateComfyUIQueryParams",
|
||||
"ComfyUISwitchVersionQueryParams",
|
||||
"QueueStatus",
|
||||
"ManagerMappings",
|
||||
"ModelMetadata",
|
||||
"NodePackageMetadata",
|
||||
"SnapshotItem",
|
||||
"Error",
|
||||
"InstalledPacksResponse",
|
||||
"HistoryResponse",
|
||||
"HistoryListResponse",
|
||||
"InstallType",
|
||||
"SecurityLevel",
|
||||
"RiskLevel",
|
||||
]
|
||||
561 comfyui_manager/data_models/generated_models.py Normal file
@@ -0,0 +1,561 @@
# generated by datamodel-codegen:
#   filename:  openapi.yaml
#   timestamp: 2025-07-31T04:52:26+00:00

from __future__ import annotations

from datetime import datetime
from enum import Enum
from typing import Any, Dict, List, Optional, Union

from pydantic import BaseModel, Field, RootModel


class OperationType(str, Enum):
    install = "install"
    uninstall = "uninstall"
    update = "update"
    update_comfyui = "update-comfyui"
    fix = "fix"
    disable = "disable"
    enable = "enable"
    install_model = "install-model"


class OperationResult(str, Enum):
    success = "success"
    failed = "failed"
    skipped = "skipped"
    error = "error"
    skip = "skip"


class TaskExecutionStatus(BaseModel):
    status_str: OperationResult
    completed: bool = Field(..., description="Whether the task completed")
    messages: List[str] = Field(..., description="Additional status messages")


class ManagerMessageName(str, Enum):
    cm_task_completed = "cm-task-completed"
    cm_task_started = "cm-task-started"
    cm_queue_status = "cm-queue-status"


class ManagerPackInfo(BaseModel):
    id: str = Field(
        ...,
        description="Either github-author/github-repo or name of pack from the registry",
    )
    version: str = Field(..., description="Semantic version or Git commit hash")
    ui_id: Optional[str] = Field(None, description="Task ID - generated internally")


class ManagerPackInstalled(BaseModel):
    ver: str = Field(
        ...,
        description="The version of the pack that is installed (Git commit hash or semantic version)",
    )
    cnr_id: Optional[str] = Field(
        None, description="The name of the pack if installed from the registry"
    )
    aux_id: Optional[str] = Field(
        None,
        description="The name of the pack if installed from github (author/repo-name format)",
    )
    enabled: bool = Field(..., description="Whether the pack is enabled")


class SelectedVersion(str, Enum):
    latest = "latest"
    nightly = "nightly"


class ManagerChannel(str, Enum):
    default = "default"
    recent = "recent"
    legacy = "legacy"
    forked = "forked"
    dev = "dev"
    tutorial = "tutorial"


class ManagerDatabaseSource(str, Enum):
    remote = "remote"
    local = "local"
    cache = "cache"


class ManagerPackState(str, Enum):
    installed = "installed"
    disabled = "disabled"
    not_installed = "not_installed"
    import_failed = "import_failed"
    needs_update = "needs_update"


class ManagerPackInstallType(str, Enum):
    git_clone = "git-clone"
    copy = "copy"
    cnr = "cnr"


class SecurityLevel(str, Enum):
    strong = "strong"
    normal = "normal"
    normal_ = "normal-"
    weak = "weak"


class RiskLevel(str, Enum):
    block = "block"
    high_ = "high+"
    high = "high"
    middle_ = "middle+"
    middle = "middle"


class UpdateState(Enum):
    false = "false"
    true = "true"


class ManagerPack(ManagerPackInfo):
    author: Optional[str] = Field(
        None, description="Pack author name or 'Unclaimed' if added via GitHub crawl"
    )
    files: Optional[List[str]] = Field(
        None,
        description="Repository URLs for installation (typically contains one GitHub URL)",
    )
    reference: Optional[str] = Field(
        None, description="The type of installation reference"
    )
    title: Optional[str] = Field(None, description="The display name of the pack")
    cnr_latest: Optional[SelectedVersion] = None
    repository: Optional[str] = Field(None, description="GitHub repository URL")
    state: Optional[ManagerPackState] = None
    update_state: Optional[UpdateState] = Field(
        None, alias="update-state", description="Update availability status"
    )
    stars: Optional[int] = Field(None, description="GitHub stars count")
    last_update: Optional[datetime] = Field(None, description="Last update timestamp")
    health: Optional[str] = Field(None, description="Health status of the pack")
    description: Optional[str] = Field(None, description="Pack description")
    trust: Optional[bool] = Field(None, description="Whether the pack is trusted")
    install_type: Optional[ManagerPackInstallType] = None


class InstallPackParams(ManagerPackInfo):
    selected_version: Union[str, SelectedVersion] = Field(
        ..., description="Semantic version, Git commit hash, latest, or nightly"
    )
    repository: Optional[str] = Field(
        None,
        description="GitHub repository URL (required if selected_version is nightly)",
    )
    pip: Optional[List[str]] = Field(None, description="PyPi dependency names")
    mode: ManagerDatabaseSource
    channel: ManagerChannel
    skip_post_install: Optional[bool] = Field(
        None, description="Whether to skip post-installation steps"
    )


class UpdateAllPacksParams(BaseModel):
    mode: Optional[ManagerDatabaseSource] = None
    ui_id: Optional[str] = Field(None, description="Task ID - generated internally")


class UpdatePackParams(BaseModel):
    node_name: str = Field(..., description="Name of the node package to update")
    node_ver: Optional[str] = Field(
        None, description="Current version of the node package"
    )


class UpdateComfyUIParams(BaseModel):
    is_stable: Optional[bool] = Field(
        True,
        description="Whether to update to stable version (true) or nightly (false)",
    )
    target_version: Optional[str] = Field(
        None,
        description="Specific version to switch to (for version switching operations)",
    )


class FixPackParams(BaseModel):
    node_name: str = Field(..., description="Name of the node package to fix")
    node_ver: str = Field(..., description="Version of the node package")


class UninstallPackParams(BaseModel):
    node_name: str = Field(..., description="Name of the node package to uninstall")
    is_unknown: Optional[bool] = Field(
        False, description="Whether this is an unknown/unregistered package"
    )


class DisablePackParams(BaseModel):
    node_name: str = Field(..., description="Name of the node package to disable")
    is_unknown: Optional[bool] = Field(
        False, description="Whether this is an unknown/unregistered package"
    )


class EnablePackParams(BaseModel):
    cnr_id: str = Field(
        ..., description="ComfyUI Node Registry ID of the package to enable"
    )


class UpdateAllQueryParams(BaseModel):
    client_id: str = Field(
        ..., description="Client identifier that initiated the request"
    )
    ui_id: str = Field(..., description="Base UI identifier for task tracking")
    mode: Optional[ManagerDatabaseSource] = None


class UpdateComfyUIQueryParams(BaseModel):
    client_id: str = Field(
        ..., description="Client identifier that initiated the request"
    )
    ui_id: str = Field(..., description="UI identifier for task tracking")
    stable: Optional[bool] = Field(
        True,
        description="Whether to update to stable version (true) or nightly (false)",
    )


class ComfyUISwitchVersionQueryParams(BaseModel):
    ver: str = Field(..., description="Version to switch to")
    client_id: str = Field(
        ..., description="Client identifier that initiated the request"
    )
    ui_id: str = Field(..., description="UI identifier for task tracking")


class QueueStatus(BaseModel):
    total_count: int = Field(
        ..., description="Total number of tasks (pending + running)"
    )
    done_count: int = Field(..., description="Number of completed tasks")
    in_progress_count: int = Field(..., description="Number of tasks currently running")
    pending_count: Optional[int] = Field(
        None, description="Number of tasks waiting to be executed"
    )
    is_processing: bool = Field(..., description="Whether the task worker is active")
    client_id: Optional[str] = Field(
        None, description="Client ID (when filtered by client)"
    )


class ManagerMappings1(BaseModel):
    title_aux: Optional[str] = Field(None, description="The display name of the pack")


class ManagerMappings(
    RootModel[Optional[Dict[str, List[Union[List[str], ManagerMappings1]]]]]
):
    root: Optional[Dict[str, List[Union[List[str], ManagerMappings1]]]] = Field(
        None, description="Tuple of [node_names, metadata]"
    )


class ModelMetadata(BaseModel):
    name: str = Field(..., description="Name of the model")
    type: str = Field(..., description="Type of model")
    base: Optional[str] = Field(None, description="Base model type")
    save_path: Optional[str] = Field(None, description="Path for saving the model")
    url: str = Field(..., description="Download URL")
    filename: str = Field(..., description="Target filename")
    ui_id: Optional[str] = Field(None, description="ID for UI reference")


class InstallType(str, Enum):
    git = "git"
    copy = "copy"
    pip = "pip"


class NodePackageMetadata(BaseModel):
    title: Optional[str] = Field(None, description="Display name of the node package")
    name: Optional[str] = Field(None, description="Repository/package name")
    files: Optional[List[str]] = Field(None, description="Source URLs for the package")
    description: Optional[str] = Field(
        None, description="Description of the node package functionality"
    )
    install_type: Optional[InstallType] = Field(None, description="Installation method")
    version: Optional[str] = Field(None, description="Version identifier")
    id: Optional[str] = Field(
        None, description="Unique identifier for the node package"
    )
    ui_id: Optional[str] = Field(None, description="ID for UI reference")
    channel: Optional[str] = Field(None, description="Source channel")
    mode: Optional[str] = Field(None, description="Source mode")


class SnapshotItem(RootModel[str]):
    root: str = Field(..., description="Name of the snapshot")


class Error(BaseModel):
    error: str = Field(..., description="Error message")


class InstalledPacksResponse(RootModel[Optional[Dict[str, ManagerPackInstalled]]]):
    root: Optional[Dict[str, ManagerPackInstalled]] = None


class HistoryListResponse(BaseModel):
    ids: Optional[List[str]] = Field(
        None, description="List of available batch history IDs"
    )


class InstalledNodeInfo(BaseModel):
    name: str = Field(..., description="Node package name")
    version: str = Field(..., description="Installed version")
    repository_url: Optional[str] = Field(None, description="Git repository URL")
    install_method: str = Field(
        ..., description="Installation method (cnr, git, pip, etc.)"
    )
    enabled: Optional[bool] = Field(
        True, description="Whether the node is currently enabled"
    )
    install_date: Optional[datetime] = Field(
        None, description="ISO timestamp of installation"
    )


class InstalledModelInfo(BaseModel):
    name: str = Field(..., description="Model filename")
    path: str = Field(..., description="Full path to model file")
    type: str = Field(..., description="Model type (checkpoint, lora, vae, etc.)")
    size_bytes: Optional[int] = Field(None, description="File size in bytes", ge=0)
    hash: Optional[str] = Field(None, description="Model file hash for verification")
    install_date: Optional[datetime] = Field(
        None, description="ISO timestamp when added"
    )


class ComfyUIVersionInfo(BaseModel):
    version: str = Field(..., description="ComfyUI version string")
    commit_hash: Optional[str] = Field(None, description="Git commit hash")
    branch: Optional[str] = Field(None, description="Git branch name")
    is_stable: Optional[bool] = Field(
        False, description="Whether this is a stable release"
    )
    last_updated: Optional[datetime] = Field(
        None, description="ISO timestamp of last update"
    )


class BatchOperation(BaseModel):
    operation_id: str = Field(..., description="Unique operation identifier")
    operation_type: OperationType
    target: str = Field(
        ..., description="Target of the operation (node name, model name, etc.)"
    )
    target_version: Optional[str] = Field(
        None, description="Target version for the operation"
    )
    result: OperationResult
    error_message: Optional[str] = Field(
        None, description="Error message if operation failed"
    )
    start_time: datetime = Field(
        ..., description="ISO timestamp when operation started"
    )
    end_time: Optional[datetime] = Field(
        None, description="ISO timestamp when operation completed"
    )
    client_id: Optional[str] = Field(
        None, description="Client that initiated the operation"
    )


class ComfyUISystemState(BaseModel):
    snapshot_time: datetime = Field(
        ..., description="ISO timestamp when snapshot was taken"
    )
    comfyui_version: ComfyUIVersionInfo
    frontend_version: Optional[str] = Field(
        None, description="ComfyUI frontend version if available"
    )
    python_version: str = Field(..., description="Python interpreter version")
    platform_info: str = Field(
        ..., description="Operating system and platform information"
    )
    installed_nodes: Optional[Dict[str, InstalledNodeInfo]] = Field(
        None, description="Map of installed node packages by name"
|
||||
)
|
||||
installed_models: Optional[Dict[str, InstalledModelInfo]] = Field(
|
||||
None, description="Map of installed models by name"
|
||||
)
|
||||
manager_config: Optional[Dict[str, Any]] = Field(
|
||||
None, description="ComfyUI Manager configuration settings"
|
||||
)
|
||||
comfyui_root_path: Optional[str] = Field(
|
||||
None, description="ComfyUI root installation directory"
|
||||
)
|
||||
model_paths: Optional[Dict[str, List[str]]] = Field(
|
||||
None, description="Map of model types to their configured paths"
|
||||
)
|
||||
manager_version: Optional[str] = Field(None, description="ComfyUI Manager version")
|
||||
security_level: Optional[SecurityLevel] = None
|
||||
network_mode: Optional[str] = Field(
|
||||
None, description="Network mode (online, offline, private)"
|
||||
)
|
||||
cli_args: Optional[Dict[str, Any]] = Field(
|
||||
None, description="Selected ComfyUI CLI arguments"
|
||||
)
|
||||
custom_nodes_count: Optional[int] = Field(
|
||||
None, description="Total number of custom node packages", ge=0
|
||||
)
|
||||
failed_imports: Optional[List[str]] = Field(
|
||||
None, description="List of custom nodes that failed to import"
|
||||
)
|
||||
pip_packages: Optional[Dict[str, str]] = Field(
|
||||
None, description="Map of installed pip packages to their versions"
|
||||
)
|
||||
embedded_python: Optional[bool] = Field(
|
||||
None,
|
||||
description="Whether ComfyUI is running from an embedded Python distribution",
|
||||
)
|
||||
|
||||
|
||||
class BatchExecutionRecord(BaseModel):
|
||||
batch_id: str = Field(..., description="Unique batch identifier")
|
||||
start_time: datetime = Field(..., description="ISO timestamp when batch started")
|
||||
end_time: Optional[datetime] = Field(
|
||||
None, description="ISO timestamp when batch completed"
|
||||
)
|
||||
state_before: ComfyUISystemState
|
||||
state_after: Optional[ComfyUISystemState] = Field(
|
||||
None, description="System state after batch execution"
|
||||
)
|
||||
operations: Optional[List[BatchOperation]] = Field(
|
||||
None, description="List of operations performed in this batch"
|
||||
)
|
||||
total_operations: Optional[int] = Field(
|
||||
0, description="Total number of operations in batch", ge=0
|
||||
)
|
||||
successful_operations: Optional[int] = Field(
|
||||
0, description="Number of successful operations", ge=0
|
||||
)
|
||||
failed_operations: Optional[int] = Field(
|
||||
0, description="Number of failed operations", ge=0
|
||||
)
|
||||
skipped_operations: Optional[int] = Field(
|
||||
0, description="Number of skipped operations", ge=0
|
||||
)
|
||||
|
||||
|
||||
class ImportFailInfoBulkRequest(BaseModel):
|
||||
cnr_ids: Optional[List[str]] = Field(
|
||||
None, description="A list of CNR IDs to check."
|
||||
)
|
||||
urls: Optional[List[str]] = Field(
|
||||
None, description="A list of repository URLs to check."
|
||||
)
|
||||
|
||||
|
||||
class ImportFailInfoItem1(BaseModel):
|
||||
error: Optional[str] = None
|
||||
traceback: Optional[str] = None
|
||||
|
||||
|
||||
class ImportFailInfoItem(RootModel[Optional[ImportFailInfoItem1]]):
|
||||
root: Optional[ImportFailInfoItem1]
|
||||
|
||||
|
||||
class QueueTaskItem(BaseModel):
|
||||
ui_id: str = Field(..., description="Unique identifier for the task")
|
||||
client_id: str = Field(..., description="Client identifier that initiated the task")
|
||||
kind: OperationType
|
||||
params: Union[
|
||||
InstallPackParams,
|
||||
UpdatePackParams,
|
||||
UpdateAllPacksParams,
|
||||
UpdateComfyUIParams,
|
||||
FixPackParams,
|
||||
UninstallPackParams,
|
||||
DisablePackParams,
|
||||
EnablePackParams,
|
||||
ModelMetadata,
|
||||
]
|
||||
|
||||
|
||||
class TaskHistoryItem(BaseModel):
|
||||
ui_id: str = Field(..., description="Unique identifier for the task")
|
||||
client_id: str = Field(..., description="Client identifier that initiated the task")
|
||||
kind: str = Field(..., description="Type of task that was performed")
|
||||
timestamp: datetime = Field(..., description="ISO timestamp when task completed")
|
||||
result: str = Field(..., description="Task result message or details")
|
||||
status: Optional[TaskExecutionStatus] = None
|
||||
batch_id: Optional[str] = Field(
|
||||
None, description="ID of the batch this task belongs to"
|
||||
)
|
||||
end_time: Optional[datetime] = Field(
|
||||
None, description="ISO timestamp when task execution ended"
|
||||
)
|
||||
|
||||
|
||||
class TaskStateMessage(BaseModel):
|
||||
history: Dict[str, TaskHistoryItem] = Field(
|
||||
..., description="Map of task IDs to their history items"
|
||||
)
|
||||
running_queue: List[QueueTaskItem] = Field(
|
||||
..., description="Currently executing tasks"
|
||||
)
|
||||
pending_queue: List[QueueTaskItem] = Field(
|
||||
..., description="Tasks waiting to be executed"
|
||||
)
|
||||
installed_packs: Dict[str, ManagerPackInstalled] = Field(
|
||||
..., description="Map of currently installed node packages by name"
|
||||
)
|
||||
|
||||
|
||||
class MessageTaskDone(BaseModel):
|
||||
ui_id: str = Field(..., description="Task identifier")
|
||||
result: str = Field(..., description="Task result message")
|
||||
kind: str = Field(..., description="Type of task")
|
||||
status: Optional[TaskExecutionStatus] = None
|
||||
timestamp: datetime = Field(..., description="ISO timestamp when task completed")
|
||||
state: TaskStateMessage
|
||||
|
||||
|
||||
class MessageTaskStarted(BaseModel):
|
||||
ui_id: str = Field(..., description="Task identifier")
|
||||
kind: str = Field(..., description="Type of task")
|
||||
timestamp: datetime = Field(..., description="ISO timestamp when task started")
|
||||
state: TaskStateMessage
|
||||
|
||||
|
||||
class MessageTaskFailed(BaseModel):
|
||||
ui_id: str = Field(..., description="Task identifier")
|
||||
error: str = Field(..., description="Error message")
|
||||
kind: str = Field(..., description="Type of task")
|
||||
timestamp: datetime = Field(..., description="ISO timestamp when task failed")
|
||||
state: TaskStateMessage
|
||||
|
||||
|
||||
class MessageUpdate(
|
||||
RootModel[Union[MessageTaskDone, MessageTaskStarted, MessageTaskFailed]]
|
||||
):
|
||||
root: Union[MessageTaskDone, MessageTaskStarted, MessageTaskFailed] = Field(
|
||||
..., description="Union type for all possible WebSocket message updates"
|
||||
)
|
||||
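

# Consumer-side sketch (an assumption, not generated code): with Pydantic v2,
# an incoming WebSocket payload can be narrowed through the root union.
# `payload` stands in for a hypothetical decoded JSON dict from the socket.
def handle_ws_payload(payload: dict) -> None:
    update = MessageUpdate.model_validate(payload)
    msg = update.root
    if isinstance(msg, MessageTaskFailed):
        print(f"task {msg.ui_id} failed: {msg.error}")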


class HistoryResponse(BaseModel):
    history: Optional[Dict[str, TaskHistoryItem]] = Field(
        None, description="Map of task IDs to their history items"
    )


class ImportFailInfoBulkResponse(RootModel[Optional[Dict[str, ImportFailInfoItem]]]):
    root: Optional[Dict[str, ImportFailInfoItem]] = None
File diff suppressed because it is too large
File diff suppressed because it is too large

11 comfyui_manager/glob/CLAUDE.md (new file)
@@ -0,0 +1,11 @@
- Anytime you make a change to the data being sent or received, you should follow this process:
  1. Adjust the openapi.yaml file first
  2. Verify the syntax of the openapi.yaml file using `yaml.safe_load` (see the sketch below)
  3. Regenerate the types following the instructions in the `data_models/README.md` file
  4. Verify the new data model is generated
  5. Verify the syntax of the generated types files
  6. Run formatting and linting on the generated types files
  7. Adjust the `__init__.py` files in the `data_models` directory to match/export the new data model
  8. Only then, make the changes to the rest of the codebase
  9. Run the CI tests to verify that the changes are working
- The comfyui_manager is a Python package used to manage the ComfyUI server. It has two sub-packages, `glob` and `legacy`, which hold the current version (`glob`) and the previous version (`legacy`) respectively, apart from common utilities and data models. When developing, work in the `glob` package. Ignore the `legacy` package entirely unless you have a very good reason to research how things were done in prior major versions; even then, look only for knowledge or reflection, not to change code (unless explicitly asked to do so).
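
A minimal sketch of the check in step 2 (the spec path is an assumption; adjust it to the repository layout):

    import yaml

    with open("comfyui_manager/openapi.yaml", "r", encoding="utf-8") as f:
        yaml.safe_load(f)  # raises yaml.YAMLError on syntax problems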
0 comfyui_manager/glob/__init__.py (new file)

55 comfyui_manager/glob/constants.py (new file)
@@ -0,0 +1,55 @@

SECURITY_MESSAGE_MIDDLE = "ERROR: To use this action, a security_level of `normal or below` is required. Please contact the administrator.\nReference: https://github.com/ltdrdata/ComfyUI-Manager#security-policy"
SECURITY_MESSAGE_MIDDLE_P = "ERROR: To use this action, security_level must be `normal or below`, and network_mode must be set to `personal_cloud`. Please contact the administrator.\nReference: https://github.com/ltdrdata/ComfyUI-Manager#security-policy"
SECURITY_MESSAGE_NORMAL_MINUS = "ERROR: To use this feature, you must either set '--listen' to a local IP and set the security level to 'normal-' or lower, or set the security level to 'middle' or 'weak'. Please contact the administrator.\nReference: https://github.com/ltdrdata/ComfyUI-Manager#security-policy"
SECURITY_MESSAGE_GENERAL = "ERROR: This installation is not allowed in this security_level. Please contact the administrator.\nReference: https://github.com/ltdrdata/ComfyUI-Manager#security-policy"
SECURITY_MESSAGE_NORMAL_MINUS_MODEL = "ERROR: Downloading models that are not in '.safetensors' format is only allowed for models registered in the 'default' channel at this security level. If you want to download this model, set the security level to 'normal-' or lower."


def is_loopback(address):
    import ipaddress

    try:
        return ipaddress.ip_address(address).is_loopback
    except ValueError:
        return False


model_dir_name_map = {
    "checkpoints": "checkpoints",
    "checkpoint": "checkpoints",
    "unclip": "checkpoints",
    "text_encoders": "text_encoders",
    "clip": "text_encoders",
    "vae": "vae",
    "lora": "loras",
    "t2i-adapter": "controlnet",
    "t2i-style": "controlnet",
    "controlnet": "controlnet",
    "clip_vision": "clip_vision",
    "gligen": "gligen",
    "upscale": "upscale_models",
    "embedding": "embeddings",
    "embeddings": "embeddings",
    "unet": "diffusion_models",
    "diffusion_model": "diffusion_models",
}

# List of all model directory names used for checking installed models
MODEL_DIR_NAMES = [
    "checkpoints",
    "loras",
    "vae",
    "text_encoders",
    "diffusion_models",
    "clip_vision",
    "embeddings",
    "diffusers",
    "vae_approx",
    "controlnet",
    "gligen",
    "upscale_models",
    "hypernetworks",
    "photomaker",
    "classifiers",
]
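
# Usage sketch (illustrative): a model's declared type resolves to a
# models/ subdirectory; "etc" mirrors the fallback bucket used by
# get_model_dir in model_utils.py later in this diff.
from comfyui_manager.glob.constants import model_dir_name_map

dir_name = model_dir_name_map.get("T2I-Adapter".lower(), "etc")
assert dir_name == "controlnet"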
3332 comfyui_manager/glob/manager_core.py (new file)
File diff suppressed because it is too large

2085 comfyui_manager/glob/manager_server.py (new file)
File diff suppressed because it is too large
@@ -1,5 +1,7 @@
 import mimetypes
-import manager_core as core
+from ..common import context
+from . import manager_core as core

 import os
 from aiohttp import web
 import aiohttp
@@ -8,6 +10,16 @@ import hashlib

 import folder_paths
 from server import PromptServer
+import logging
+import sys
+
+
+try:
+    from nio import AsyncClient, LoginResponse, UploadResponse
+    matrix_nio_is_available = True
+except Exception:
+    logging.warning(f"[ComfyUI-Manager] The matrix sharing feature has been disabled because the `matrix-nio` dependency is not installed.\n\tTo use this feature, please run the following command:\n\t{sys.executable} -m pip install matrix-nio\n")
+    matrix_nio_is_available = False


 def extract_model_file_names(json_data):
@@ -53,7 +65,7 @@ def compute_sha256_checksum(filepath):
     return sha256.hexdigest()


-@PromptServer.instance.routes.get("/manager/share_option")
+@PromptServer.instance.routes.get("/v2/manager/share_option")
 async def share_option(request):
     if "value" in request.rel_url.query:
         core.get_config()['share_option'] = request.rel_url.query['value']
@@ -65,21 +77,21 @@ async def share_option(request):


 def get_openart_auth():
-    if not os.path.exists(os.path.join(core.manager_files_path, ".openart_key")):
+    if not os.path.exists(os.path.join(context.manager_files_path, ".openart_key")):
         return None
     try:
-        with open(os.path.join(core.manager_files_path, ".openart_key"), "r") as f:
+        with open(os.path.join(context.manager_files_path, ".openart_key"), "r") as f:
             openart_key = f.read().strip()
         return openart_key if openart_key else None
-    except:
+    except Exception:
         return None


 def get_matrix_auth():
-    if not os.path.exists(os.path.join(core.manager_files_path, "matrix_auth")):
+    if not os.path.exists(os.path.join(context.manager_files_path, "matrix_auth")):
         return None
     try:
-        with open(os.path.join(core.manager_files_path, "matrix_auth"), "r") as f:
+        with open(os.path.join(context.manager_files_path, "matrix_auth"), "r") as f:
             matrix_auth = f.read()
             homeserver, username, password = matrix_auth.strip().split("\n")
             if not homeserver or not username or not password:
@@ -89,40 +101,40 @@ def get_matrix_auth():
             "username": username,
             "password": password,
         }
-    except:
+    except Exception:
         return None


 def get_comfyworkflows_auth():
-    if not os.path.exists(os.path.join(core.manager_files_path, "comfyworkflows_sharekey")):
+    if not os.path.exists(os.path.join(context.manager_files_path, "comfyworkflows_sharekey")):
         return None
     try:
-        with open(os.path.join(core.manager_files_path, "comfyworkflows_sharekey"), "r") as f:
+        with open(os.path.join(context.manager_files_path, "comfyworkflows_sharekey"), "r") as f:
             share_key = f.read()
             if not share_key.strip():
                 return None
         return share_key
-    except:
+    except Exception:
         return None


 def get_youml_settings():
-    if not os.path.exists(os.path.join(core.manager_files_path, ".youml")):
+    if not os.path.exists(os.path.join(context.manager_files_path, ".youml")):
         return None
     try:
-        with open(os.path.join(core.manager_files_path, ".youml"), "r") as f:
+        with open(os.path.join(context.manager_files_path, ".youml"), "r") as f:
             youml_settings = f.read().strip()
         return youml_settings if youml_settings else None
-    except:
+    except Exception:
         return None


 def set_youml_settings(settings):
-    with open(os.path.join(core.manager_files_path, ".youml"), "w") as f:
+    with open(os.path.join(context.manager_files_path, ".youml"), "w") as f:
         f.write(settings)


-@PromptServer.instance.routes.get("/manager/get_openart_auth")
+@PromptServer.instance.routes.get("/v2/manager/get_openart_auth")
 async def api_get_openart_auth(request):
     # print("Getting stored Matrix credentials...")
     openart_key = get_openart_auth()
@@ -131,16 +143,16 @@ async def api_get_openart_auth(request):
     return web.json_response({"openart_key": openart_key})


-@PromptServer.instance.routes.post("/manager/set_openart_auth")
+@PromptServer.instance.routes.post("/v2/manager/set_openart_auth")
 async def api_set_openart_auth(request):
     json_data = await request.json()
     openart_key = json_data['openart_key']
-    with open(os.path.join(core.manager_files_path, ".openart_key"), "w") as f:
+    with open(os.path.join(context.manager_files_path, ".openart_key"), "w") as f:
         f.write(openart_key)
     return web.Response(status=200)


-@PromptServer.instance.routes.get("/manager/get_matrix_auth")
+@PromptServer.instance.routes.get("/v2/manager/get_matrix_auth")
 async def api_get_matrix_auth(request):
     # print("Getting stored Matrix credentials...")
     matrix_auth = get_matrix_auth()
@@ -149,7 +161,7 @@ async def api_get_matrix_auth(request):
     return web.json_response(matrix_auth)


-@PromptServer.instance.routes.get("/manager/youml/settings")
+@PromptServer.instance.routes.get("/v2/manager/youml/settings")
 async def api_get_youml_settings(request):
     youml_settings = get_youml_settings()
     if not youml_settings:
@@ -157,14 +169,14 @@ async def api_get_youml_settings(request):
     return web.json_response(json.loads(youml_settings))


-@PromptServer.instance.routes.post("/manager/youml/settings")
+@PromptServer.instance.routes.post("/v2/manager/youml/settings")
 async def api_set_youml_settings(request):
     json_data = await request.json()
     set_youml_settings(json.dumps(json_data))
     return web.Response(status=200)


-@PromptServer.instance.routes.get("/manager/get_comfyworkflows_auth")
+@PromptServer.instance.routes.get("/v2/manager/get_comfyworkflows_auth")
 async def api_get_comfyworkflows_auth(request):
     # Check if the user has provided Matrix credentials in a file called 'matrix_accesstoken'
     # in the same directory as the ComfyUI base folder
@@ -175,31 +187,39 @@ async def api_get_comfyworkflows_auth(request):
     return web.json_response({"comfyworkflows_sharekey": comfyworkflows_auth})


-@PromptServer.instance.routes.post("/manager/set_esheep_workflow_and_images")
+@PromptServer.instance.routes.post("/v2/manager/set_esheep_workflow_and_images")
 async def set_esheep_workflow_and_images(request):
     json_data = await request.json()
-    with open(os.path.join(core.manager_files_path, "esheep_share_message.json"), "w", encoding='utf-8') as file:
+    with open(os.path.join(context.manager_files_path, "esheep_share_message.json"), "w", encoding='utf-8') as file:
         json.dump(json_data, file, indent=4)
     return web.Response(status=200)


-@PromptServer.instance.routes.get("/manager/get_esheep_workflow_and_images")
+@PromptServer.instance.routes.get("/v2/manager/get_esheep_workflow_and_images")
 async def get_esheep_workflow_and_images(request):
-    with open(os.path.join(core.manager_files_path, "esheep_share_message.json"), 'r', encoding='utf-8') as file:
+    with open(os.path.join(context.manager_files_path, "esheep_share_message.json"), 'r', encoding='utf-8') as file:
         data = json.load(file)
         return web.Response(status=200, text=json.dumps(data))


+@PromptServer.instance.routes.get("/v2/manager/get_matrix_dep_status")
+async def get_matrix_dep_status(request):
+    if matrix_nio_is_available:
+        return web.Response(status=200, text='available')
+    else:
+        return web.Response(status=200, text='unavailable')
+
+
 def set_matrix_auth(json_data):
     homeserver = json_data['homeserver']
     username = json_data['username']
     password = json_data['password']
-    with open(os.path.join(core.manager_files_path, "matrix_auth"), "w") as f:
+    with open(os.path.join(context.manager_files_path, "matrix_auth"), "w") as f:
         f.write("\n".join([homeserver, username, password]))


 def set_comfyworkflows_auth(comfyworkflows_sharekey):
-    with open(os.path.join(core.manager_files_path, "comfyworkflows_sharekey"), "w") as f:
+    with open(os.path.join(context.manager_files_path, "comfyworkflows_sharekey"), "w") as f:
         f.write(comfyworkflows_sharekey)


@@ -211,7 +231,7 @@ def has_provided_comfyworkflows_auth(comfyworkflows_sharekey):
     return comfyworkflows_sharekey.strip()


-@PromptServer.instance.routes.post("/manager/share")
+@PromptServer.instance.routes.post("/v2/manager/share")
 async def share_art(request):
     # get json data
     json_data = await request.json()
@@ -233,7 +253,7 @@ async def share_art(request):

     try:
         output_to_share = potential_outputs[int(selected_output_index)]
-    except:
+    except Exception:
         # for now, pick the first output
         output_to_share = potential_outputs[0]

@@ -329,15 +349,12 @@ async def share_art(request):
         workflowId = upload_workflow_json["workflowId"]

     # check if the user has provided Matrix credentials
-    if "matrix" in share_destinations:
+    if matrix_nio_is_available and "matrix" in share_destinations:
         comfyui_share_room_id = '!LGYSoacpJPhIfBqVfb:matrix.org'
         filename = os.path.basename(asset_filepath)
         content_type = assetFileType

         try:
-            from matrix_client.api import MatrixHttpApi
-            from matrix_client.client import MatrixClient
-
             homeserver = 'matrix.org'
             if matrix_auth:
                 homeserver = matrix_auth.get('homeserver', 'matrix.org')
@@ -345,20 +362,35 @@ async def share_art(request):
             if not homeserver.startswith("https://"):
                 homeserver = "https://" + homeserver

-            client = MatrixClient(homeserver)
-            try:
-                token = client.login(username=matrix_auth['username'], password=matrix_auth['password'])
-                if not token:
-                    return web.json_response({"error": "Invalid Matrix credentials."}, content_type='application/json', status=400)
-            except:
+            client = AsyncClient(homeserver, matrix_auth['username'])
+
+            # Login
+            login_resp = await client.login(matrix_auth['password'])
+            if not isinstance(login_resp, LoginResponse) or not login_resp.access_token:
+                await client.close()
                 return web.json_response({"error": "Invalid Matrix credentials."}, content_type='application/json', status=400)

-            matrix = MatrixHttpApi(homeserver, token=token)
+            # Upload asset
             with open(asset_filepath, 'rb') as f:
-                mxc_url = matrix.media_upload(f.read(), content_type, filename=filename)['content_uri']
+                upload_resp, _maybe_keys = await client.upload(f, content_type=content_type, filename=filename)
+                asset_data = f.seek(0) or f.read()  # get size for info below
+            if not isinstance(upload_resp, UploadResponse) or not upload_resp.content_uri:
+                await client.close()
+                return web.json_response({"error": "Failed to upload asset to Matrix."}, content_type='application/json', status=500)
+            mxc_url = upload_resp.content_uri

-            workflow_json_mxc_url = matrix.media_upload(prompt['workflow'], 'application/json', filename='workflow.json')['content_uri']
+            # Upload workflow JSON
+            import io
+            workflow_json_bytes = json.dumps(prompt['workflow']).encode('utf-8')
+            workflow_io = io.BytesIO(workflow_json_bytes)
+            upload_workflow_resp, _maybe_keys = await client.upload(workflow_io, content_type='application/json', filename='workflow.json')
+            workflow_io.seek(0)
+            if not isinstance(upload_workflow_resp, UploadResponse) or not upload_workflow_resp.content_uri:
+                await client.close()
+                return web.json_response({"error": "Failed to upload workflow to Matrix."}, content_type='application/json', status=500)
+            workflow_json_mxc_url = upload_workflow_resp.content_uri

+            # Send text message
             text_content = ""
             if title:
                 text_content += f"{title}\n"
@@ -366,9 +398,44 @@ async def share_art(request):
                 text_content += f"{description}\n"
             if credits:
                 text_content += f"\ncredits: {credits}\n"
-            matrix.send_message(comfyui_share_room_id, text_content)
-            matrix.send_content(comfyui_share_room_id, mxc_url, filename, 'm.image')
-            matrix.send_content(comfyui_share_room_id, workflow_json_mxc_url, 'workflow.json', 'm.file')
+            await client.room_send(
+                room_id=comfyui_share_room_id,
+                message_type="m.room.message",
+                content={"msgtype": "m.text", "body": text_content}
+            )
+
+            # Send image
+            await client.room_send(
+                room_id=comfyui_share_room_id,
+                message_type="m.room.message",
+                content={
+                    "msgtype": "m.image",
+                    "body": filename,
+                    "url": mxc_url,
+                    "info": {
+                        "mimetype": content_type,
+                        "size": len(asset_data)
+                    }
+                }
+            )
+
+            # Send workflow JSON file
+            await client.room_send(
+                room_id=comfyui_share_room_id,
+                message_type="m.room.message",
+                content={
+                    "msgtype": "m.file",
+                    "body": "workflow.json",
+                    "url": workflow_json_mxc_url,
+                    "info": {
+                        "mimetype": "application/json",
+                        "size": len(workflow_json_bytes)
+                    }
+                }
+            )
+
+            await client.close()

         except:
             import traceback
             traceback.print_exc()
0 comfyui_manager/glob/utils/__init__.py (new file)

142 comfyui_manager/glob/utils/environment_utils.py (new file)
@@ -0,0 +1,142 @@
import os
import git
import logging
import traceback

from comfyui_manager.common import context
import folder_paths
from comfy.cli_args import args
import latent_preview

from comfyui_manager.glob import manager_core as core
from comfyui_manager.common import cm_global


comfy_ui_hash = "-"
comfyui_tag = None


def print_comfyui_version():
    global comfy_ui_hash
    global comfyui_tag

    is_detached = False
    try:
        repo = git.Repo(os.path.dirname(folder_paths.__file__))
        core.comfy_ui_revision = len(list(repo.iter_commits("HEAD")))

        comfy_ui_hash = repo.head.commit.hexsha
        cm_global.variables["comfyui.revision"] = core.comfy_ui_revision

        core.comfy_ui_commit_datetime = repo.head.commit.committed_datetime
        cm_global.variables["comfyui.commit_datetime"] = core.comfy_ui_commit_datetime

        is_detached = repo.head.is_detached
        current_branch = repo.active_branch.name

        comfyui_tag = context.get_comfyui_tag()

        try:
            if (
                not os.environ.get("__COMFYUI_DESKTOP_VERSION__")
                and core.comfy_ui_commit_datetime.date()
                < core.comfy_ui_required_commit_datetime.date()
            ):
                logging.warning(
                    f"\n\n## [WARN] ComfyUI-Manager: Your ComfyUI version ({core.comfy_ui_revision})[{core.comfy_ui_commit_datetime.date()}] is too old. Please update to the latest version. ##\n\n"
                )
        except Exception:
            pass

        # process on_revision_detected -->
        if "cm.on_revision_detected_handler" in cm_global.variables:
            for k, f in cm_global.variables["cm.on_revision_detected_handler"]:
                try:
                    f(core.comfy_ui_revision)
                except Exception:
                    logging.error(f"[ERROR] '{k}' on_revision_detected_handler")
                    traceback.print_exc()

            del cm_global.variables["cm.on_revision_detected_handler"]
        else:
            logging.warning(
                "[ComfyUI-Manager] Some features are restricted due to your ComfyUI being outdated."
            )
        # <--

        if current_branch == "master":
            if comfyui_tag:
                logging.info(
                    f"### ComfyUI Version: {comfyui_tag} | Released on '{core.comfy_ui_commit_datetime.date()}'"
                )
            else:
                logging.info(
                    f"### ComfyUI Revision: {core.comfy_ui_revision} [{comfy_ui_hash[:8]}] | Released on '{core.comfy_ui_commit_datetime.date()}'"
                )
        else:
            if comfyui_tag:
                logging.info(
                    f"### ComfyUI Version: {comfyui_tag} on '{current_branch}' | Released on '{core.comfy_ui_commit_datetime.date()}'"
                )
            else:
                logging.info(
                    f"### ComfyUI Revision: {core.comfy_ui_revision} on '{current_branch}' [{comfy_ui_hash[:8]}] | Released on '{core.comfy_ui_commit_datetime.date()}'"
                )
    except Exception:
        if is_detached:
            logging.info(
                f"### ComfyUI Revision: {core.comfy_ui_revision} [{comfy_ui_hash[:8]}] *DETACHED | Released on '{core.comfy_ui_commit_datetime.date()}'"
            )
        else:
            logging.info(
                "### ComfyUI Revision: UNKNOWN (The currently installed ComfyUI is not a Git repository)"
            )


def set_preview_method(method):
    if method == "auto":
        args.preview_method = latent_preview.LatentPreviewMethod.Auto
    elif method == "latent2rgb":
        args.preview_method = latent_preview.LatentPreviewMethod.Latent2RGB
    elif method == "taesd":
        args.preview_method = latent_preview.LatentPreviewMethod.TAESD
    else:
        args.preview_method = latent_preview.LatentPreviewMethod.NoPreviews

    core.get_config()["preview_method"] = method


def set_update_policy(mode):
    core.get_config()["update_policy"] = mode


def set_db_mode(mode):
    core.get_config()["db_mode"] = mode


def setup_environment():
    git_exe = core.get_config()["git_exe"]

    if git_exe != "":
        git.Git().update_environment(GIT_PYTHON_GIT_EXECUTABLE=git_exe)


def initialize_environment():
    context.comfy_path = os.path.dirname(folder_paths.__file__)
    core.js_path = os.path.join(context.comfy_path, "web", "extensions")

    # Legacy database paths - kept for potential future use
    # local_db_model = os.path.join(manager_util.comfyui_manager_path, "model-list.json")
    # local_db_alter = os.path.join(manager_util.comfyui_manager_path, "alter-list.json")
    # local_db_custom_node_list = os.path.join(
    #     manager_util.comfyui_manager_path, "custom-node-list.json"
    # )
    # local_db_extension_node_mappings = os.path.join(
    #     manager_util.comfyui_manager_path, "extension-node-map.json"
    # )

    set_preview_method(core.get_config()["preview_method"])
    print_comfyui_version()
    setup_environment()

    core.check_invalid_nodes()
60 comfyui_manager/glob/utils/formatting_utils.py (new file)
@@ -0,0 +1,60 @@
import locale
import sys
import re


def handle_stream(stream, prefix):
    stream.reconfigure(encoding=locale.getpreferredencoding(), errors="replace")
    for msg in stream:
        if (
            prefix == "[!]"
            and ("it/s]" in msg or "s/it]" in msg)
            and ("%|" in msg or "it [" in msg)
        ):
            if msg.startswith("100%"):
                print("\r" + msg, end="", file=sys.stderr)
            else:
                print("\r" + msg[:-1], end="", file=sys.stderr)
        else:
            if prefix == "[!]":
                print(prefix, msg, end="", file=sys.stderr)
            else:
                print(prefix, msg, end="")


def convert_markdown_to_html(input_text):
    pattern_a = re.compile(r"\[a/([^]]+)]\(([^)]+)\)")
    pattern_w = re.compile(r"\[w/([^]]+)]")
    pattern_i = re.compile(r"\[i/([^]]+)]")
    pattern_bold = re.compile(r"\*\*([^*]+)\*\*")
    pattern_white = re.compile(r"%%([^*]+)%%")

    def replace_a(match):
        return f"<a href='{match.group(2)}' target='blank'>{match.group(1)}</a>"

    def replace_w(match):
        return f"<p class='cm-warn-note'>{match.group(1)}</p>"

    def replace_i(match):
        return f"<p class='cm-info-note'>{match.group(1)}</p>"

    def replace_bold(match):
        return f"<B>{match.group(1)}</B>"

    def replace_white(match):
        return f"<font color='white'>{match.group(1)}</font>"

    input_text = (
        input_text.replace("\\[", "[")
        .replace("\\]", "]")
        .replace("<", "&lt;")
        .replace(">", "&gt;")
    )

    result_text = re.sub(pattern_a, replace_a, input_text)
    result_text = re.sub(pattern_w, replace_w, result_text)
    result_text = re.sub(pattern_i, replace_i, result_text)
    result_text = re.sub(pattern_bold, replace_bold, result_text)
    result_text = re.sub(pattern_white, replace_white, result_text)

    return result_text.replace("\n", "<BR>")
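
# Usage sketch (illustrative input and output):
html = convert_markdown_to_html("**note** [a/docs](https://example.com)\nsee link")
# -> "<B>note</B> <a href='https://example.com' target='blank'>docs</a><BR>see link"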
161 comfyui_manager/glob/utils/model_utils.py (new file)
@@ -0,0 +1,161 @@
import os
import logging
import concurrent.futures
import folder_paths

from comfyui_manager.glob import manager_core as core
from comfyui_manager.glob.constants import model_dir_name_map, MODEL_DIR_NAMES


def get_model_dir(data, show_log=False):
    if "download_model_base" in folder_paths.folder_names_and_paths:
        models_base = folder_paths.folder_names_and_paths["download_model_base"][0][0]
    else:
        models_base = folder_paths.models_dir

    # NOTE: Validate to prevent path traversal.
    if any(char in data["filename"] for char in {"/", "\\", ":"}):
        return None

    def resolve_custom_node(save_path):
        save_path = save_path[13:]  # remove 'custom_nodes/'

        # NOTE: Validate to prevent path traversal.
        if save_path.startswith(os.path.sep) or ":" in save_path:
            return None

        repo_name = save_path.replace("\\", "/").split("/")[0]  # get custom node repo name

        # NOTE: The creation of files within the custom node path should be removed in the future.
        repo_path = core.lookup_installed_custom_nodes_legacy(repo_name)
        if repo_path is not None and repo_path[0]:
            # Returns the retargeted path based on the actually installed repository
            return os.path.join(os.path.dirname(repo_path[1]), save_path)
        else:
            return None

    if data["save_path"] != "default":
        if ".." in data["save_path"] or data["save_path"].startswith("/"):
            if show_log:
                logging.info(
                    f"[WARN] '{data['save_path']}' is not allowed path. So it will be saved into 'models/etc'."
                )
            base_model = os.path.join(models_base, "etc")
        else:
            if data["save_path"].startswith("custom_nodes"):
                base_model = resolve_custom_node(data["save_path"])
                if base_model is None:
                    if show_log:
                        logging.info(
                            f"[ComfyUI-Manager] The target custom node for model download is not installed: {data['save_path']}"
                        )
                    return None
            else:
                base_model = os.path.join(models_base, data["save_path"])
    else:
        model_dir_name = model_dir_name_map.get(data["type"].lower())
        if model_dir_name is not None:
            base_model = folder_paths.folder_names_and_paths[model_dir_name][0][0]
        else:
            base_model = os.path.join(models_base, "etc")

    return base_model


def get_model_path(data, show_log=False):
    base_model = get_model_dir(data, show_log)
    if base_model is None:
        return None
    else:
        if data["filename"] == "<huggingface>":
            return os.path.join(base_model, os.path.basename(data["url"]))
        else:
            return os.path.join(base_model, data["filename"])


def check_model_installed(json_obj):
    def is_exists(model_dir_name, filename, url):
        if filename == "<huggingface>":
            filename = os.path.basename(url)

        dirs = folder_paths.get_folder_paths(model_dir_name)

        for x in dirs:
            if os.path.exists(os.path.join(x, filename)):
                return True

        return False

    total_models_files = set()
    for x in MODEL_DIR_NAMES:
        for y in folder_paths.get_filename_list(x):
            total_models_files.add(y)

    def process_model_phase(item):
        if (
            "diffusion" not in item["filename"]
            and "pytorch" not in item["filename"]
            and "model" not in item["filename"]
        ):
            # non-general name case
            if item["filename"] in total_models_files:
                item["installed"] = "True"
                return

        if item["save_path"] == "default":
            model_dir_name = model_dir_name_map.get(item["type"].lower())
            if model_dir_name is not None:
                item["installed"] = str(
                    is_exists(model_dir_name, item["filename"], item["url"])
                )
            else:
                item["installed"] = "False"
        else:
            model_dir_name = item["save_path"].split("/")[0]
            if model_dir_name in folder_paths.folder_names_and_paths:
                if is_exists(model_dir_name, item["filename"], item["url"]):
                    item["installed"] = "True"

            if "installed" not in item:
                if item["filename"] == "<huggingface>":
                    filename = os.path.basename(item["url"])
                else:
                    filename = item["filename"]

                fullpath = os.path.join(
                    folder_paths.models_dir, item["save_path"], filename
                )

                item["installed"] = "True" if os.path.exists(fullpath) else "False"

    with concurrent.futures.ThreadPoolExecutor(8) as executor:
        for item in json_obj["models"]:
            executor.submit(process_model_phase, item)


async def check_whitelist_for_model(item):
    from comfyui_manager.data_models import ManagerDatabaseSource

    json_obj = await core.get_data_by_mode(ManagerDatabaseSource.cache.value, "model-list.json")

    for x in json_obj.get("models", []):
        if (
            x["save_path"] == item["save_path"]
            and x["base"] == item["base"]
            and x["filename"] == item["filename"]
        ):
            return True

    json_obj = await core.get_data_by_mode(ManagerDatabaseSource.local.value, "model-list.json")

    for x in json_obj.get("models", []):
        if (
            x["save_path"] == item["save_path"]
            and x["base"] == item["base"]
            and x["filename"] == item["filename"]
        ):
            return True

    return False
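
# Usage sketch: resolve the target path for one model entry. The keys
# mirror the ModelMetadata fields above; all values here are hypothetical.
sample = {
    "filename": "example_lora.safetensors",
    "type": "lora",
    "save_path": "default",
    "url": "https://example.com/example_lora.safetensors",
}
target = get_model_path(sample)  # e.g. "<models_dir>/loras/example_lora.safetensors"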
65 comfyui_manager/glob/utils/node_pack_utils.py (new file)
@@ -0,0 +1,65 @@
import concurrent.futures

from comfyui_manager.glob import manager_core as core


def check_state_of_git_node_pack(
    node_packs, do_fetch=False, do_update_check=True, do_update=False
):
    if do_fetch:
        print("Start fetching...", end="")
    elif do_update:
        print("Start updating...", end="")
    elif do_update_check:
        print("Start update check...", end="")

    def process_custom_node(item):
        core.check_state_of_git_node_pack_single(
            item, do_fetch, do_update_check, do_update
        )

    with concurrent.futures.ThreadPoolExecutor(4) as executor:
        for k, v in node_packs.items():
            if v.get("active_version") in ["unknown", "nightly"]:
                executor.submit(process_custom_node, v)

    if do_fetch:
        print("\x1b[2K\rFetching done.")
    elif do_update:
        update_exists = any(
            item.get("updatable", False) for item in node_packs.values()
        )
        if update_exists:
            print("\x1b[2K\rUpdate done.")
        else:
            print("\x1b[2K\rAll extensions are already up-to-date.")
    elif do_update_check:
        print("\x1b[2K\rUpdate check done.")


def nickname_filter(json_obj):
    preemptions_map = {}

    for k, x in json_obj.items():
        if "preemptions" in x[1]:
            for y in x[1]["preemptions"]:
                preemptions_map[y] = k
        elif k.endswith("/ComfyUI"):
            for y in x[0]:
                preemptions_map[y] = k

    updates = {}
    for k, x in json_obj.items():
        removes = set()
        for y in x[0]:
            k2 = preemptions_map.get(y)
            if k2 is not None and k != k2:
                removes.add(y)

        if len(removes) > 0:
            updates[k] = [y for y in x[0] if y not in removes]

    for k, v in updates.items():
        json_obj[k][0] = v

    return json_obj
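
# Sketch of nickname_filter's input shape, which mirrors ManagerMappings
# ({pack_key: [node_names, metadata]}); the pack keys here are hypothetical.
mappings = {
    "author/PackA": [["NodeX", "NodeY"], {"title_aux": "Pack A"}],
    "author/PackB": [["NodeX"], {"title_aux": "Pack B", "preemptions": ["NodeX"]}],
}
nickname_filter(mappings)  # PackB preempts "NodeX", so it is dropped from PackA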
67 comfyui_manager/glob/utils/security_utils.py (new file)
@@ -0,0 +1,67 @@
from comfyui_manager.glob import manager_core as core
from comfy.cli_args import args
from comfyui_manager.data_models import SecurityLevel, RiskLevel, ManagerDatabaseSource


def is_loopback(address):
    import ipaddress
    try:
        return ipaddress.ip_address(address).is_loopback
    except ValueError:
        return False


def is_allowed_security_level(level):
    is_local_mode = is_loopback(args.listen)
    is_personal_cloud = core.get_config()['network_mode'].lower() == 'personal_cloud'

    if level == RiskLevel.block.value:
        return False
    elif level == RiskLevel.high_.value:
        if is_local_mode:
            return core.get_config()['security_level'] in [SecurityLevel.weak.value, SecurityLevel.normal_.value]
        elif is_personal_cloud:
            return core.get_config()['security_level'] == SecurityLevel.weak.value
        else:
            return False
    elif level == RiskLevel.high.value:
        if is_local_mode:
            return core.get_config()['security_level'] in [SecurityLevel.weak.value, SecurityLevel.normal_.value]
        else:
            return core.get_config()['security_level'] == SecurityLevel.weak.value
    elif level == RiskLevel.middle_.value:
        if is_local_mode or is_personal_cloud:
            return core.get_config()['security_level'] in [SecurityLevel.weak.value, SecurityLevel.normal.value, SecurityLevel.normal_.value]
        else:
            return False
    elif level == RiskLevel.middle.value:
        return core.get_config()['security_level'] in [SecurityLevel.weak.value, SecurityLevel.normal.value, SecurityLevel.normal_.value]
    else:
        return True


async def get_risky_level(files, pip_packages):
    json_data1 = await core.get_data_by_mode(ManagerDatabaseSource.local.value, "custom-node-list.json")
    json_data2 = await core.get_data_by_mode(
        ManagerDatabaseSource.cache.value,
        "custom-node-list.json",
        channel_url="https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main",
    )

    all_urls = set()
    for x in json_data1["custom_nodes"] + json_data2["custom_nodes"]:
        all_urls.update(x.get("files", []))

    for x in files:
        if x not in all_urls:
            return RiskLevel.high_.value

    all_pip_packages = set()
    for x in json_data1["custom_nodes"] + json_data2["custom_nodes"]:
        all_pip_packages.update(x.get("pip", []))

    for p in pip_packages:
        if p not in all_pip_packages:
            return RiskLevel.block.value

    return RiskLevel.middle_.value
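
# Sketch: gate a risky action on the configured security level before
# running it. SECURITY_MESSAGE_GENERAL comes from constants.py above;
# raising PermissionError is an assumed handling pattern, not project code.
from comfyui_manager.glob.constants import SECURITY_MESSAGE_GENERAL

if not is_allowed_security_level(RiskLevel.high.value):
    raise PermissionError(SECURITY_MESSAGE_GENERAL)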
50 comfyui_manager/js/README.md (new file)
@@ -0,0 +1,50 @@
# ComfyUI-Manager: Frontend (js)

This directory contains the JavaScript frontend implementation for ComfyUI-Manager, providing the user interface components that interact with the backend API.

## Core Components

- **comfyui-manager.js**: Main entry point that initializes the manager UI and integrates with ComfyUI.
- **custom-nodes-manager.js**: Implements the UI for browsing, installing, and managing custom nodes.
- **model-manager.js**: Handles the model management interface for downloading and organizing AI models.
- **components-manager.js**: Manages the reusable workflow components system.
- **snapshot.js**: Implements the snapshot system for backing up and restoring installations.

## Sharing Components

- **comfyui-share-common.js**: Base functionality for workflow sharing features.
- **comfyui-share-copus.js**: Integration with the ComfyUI Opus sharing platform.
- **comfyui-share-openart.js**: Integration with the OpenArt sharing platform.
- **comfyui-share-youml.js**: Integration with the YouML sharing platform.

## Utility Components

- **cm-api.js**: Client-side API wrapper for communication with the backend.
- **common.js**: Shared utilities and helper functions used across the frontend.
- **node_fixer.js**: Utilities for fixing disconnected links and repairing malformed nodes by recreating them while preserving connections.
- **popover-helper.js**: UI component for popup tooltips and contextual information.
- **turbogrid.esm.js**: Grid component library - https://github.com/cenfun/turbogrid
- **workflow-metadata.js**: Handles workflow metadata parsing, validation, and cross-repository compatibility, including versioning, dependency tracking, and resource management.

## Architecture

The frontend follows a modular component-based architecture:

1. **Integration Layer**: Connects with ComfyUI's existing UI system
2. **Manager Components**: Individual functional UI components (node manager, model manager, etc.)
3. **Sharing Components**: Platform-specific sharing implementations
4. **Utility Layer**: Reusable UI components and helpers

## Implementation Details

- The frontend integrates directly with ComfyUI's UI system through `app.js`
- Dialog-based UI for most manager functions to avoid cluttering the main interface
- Asynchronous API calls to handle backend operations without blocking the UI

## Styling

CSS files are included for specific components:
- **custom-nodes-manager.css**: Styling for the node management UI
- **model-manager.css**: Styling for the model management UI

This frontend implementation provides a comprehensive yet user-friendly interface for managing the ComfyUI ecosystem.
@@ -25,7 +25,7 @@ async function tryInstallCustomNode(event) {
     const res = await customConfirm(msg);
     if(res) {
         if(event.detail.target.installed == 'Disabled') {
-            const response = await api.fetchApi(`/customnode/toggle_active`, {
+            const response = await api.fetchApi(`/v2/customnode/toggle_active`, {
                 method: 'POST',
                 headers: { 'Content-Type': 'application/json' },
                 body: JSON.stringify(event.detail.target)
@@ -35,7 +35,7 @@ async function tryInstallCustomNode(event) {
             await sleep(300);
             app.ui.dialog.show(`Installing... '${event.detail.target.title}'`);

-            const response = await api.fetchApi(`/customnode/install`, {
+            const response = await api.fetchApi(`/v2/customnode/install`, {
                 method: 'POST',
                 headers: { 'Content-Type': 'application/json' },
                 body: JSON.stringify(event.detail.target)
@@ -52,7 +52,7 @@ async function tryInstallCustomNode(event) {
         }
     }

-    let response = await api.fetchApi("/manager/reboot");
+    let response = await api.fetchApi("/v2/manager/reboot");
     if(response.status == 403) {
         show_message('This action is not allowed with this security level configuration.');
         return false;
@@ -14,9 +14,9 @@ import { OpenArtShareDialog } from "./comfyui-share-openart.js";
 import {
     free_models, install_pip, install_via_git_url, manager_instance,
     rebootAPI, setManagerInstance, show_message, customAlert, customPrompt,
-    infoToast, showTerminal, setNeedRestart
+    infoToast, showTerminal, setNeedRestart, generateUUID
 } from "./common.js";
-import { ComponentBuilderDialog, getPureName, load_components, set_component_policy } from "./components-manager.js";
+import { ComponentBuilderDialog, load_components, set_component_policy } from "./components-manager.js";
 import { CustomNodesManager } from "./custom-nodes-manager.js";
 import { ModelManager } from "./model-manager.js";
 import { SnapshotManager } from "./snapshot.js";
@@ -189,8 +189,7 @@ docStyle.innerHTML = `
 }
 `;

-function is_legacy_front() {
-    let compareVersion = '1.2.49';
+function isBeforeFrontendVersion(compareVersion) {
     try {
         const frontendVersion = window['__COMFYUI_FRONTEND_VERSION__'];
         if (typeof frontendVersion !== 'string') {
@@ -232,7 +231,7 @@
 var restart_stop_button = null;
 var update_policy_combo = null;

 let share_option = 'all';
-var is_updating = false;
+var batch_id = null;


 // copied style from https://github.com/pythongosssss/ComfyUI-Custom-Scripts
@@ -415,7 +414,7 @@ const style = `
 `;

 async function init_share_option() {
-    api.fetchApi('/manager/share_option')
+    api.fetchApi('/v2/manager/share_option')
         .then(response => response.text())
         .then(data => {
             share_option = data || 'all';
@@ -423,7 +422,7 @@
 }

 async function init_notice(notice) {
-    api.fetchApi('/manager/notice')
+    api.fetchApi('/v2/manager/notice')
         .then(response => response.text())
         .then(data => {
             notice.innerHTML = data;
@@ -474,14 +473,19 @@ async function updateComfyUI() {
     let prev_text = update_comfyui_button.innerText;
     update_comfyui_button.innerText = "Updating ComfyUI...";

-    set_inprogress_mode();
-
-    const response = await api.fetchApi('/manager/queue/update_comfyui');
-
+    // set_inprogress_mode();
     showTerminal();

-    is_updating = true;
-    await api.fetchApi('/manager/queue/start');
+    batch_id = generateUUID();
+
+    let batch = {};
+    batch['batch_id'] = batch_id;
+    batch['update_comfyui'] = true;
+
+    const res = await api.fetchApi(`/v2/manager/queue/batch`, {
+        method: 'POST',
+        body: JSON.stringify(batch)
+    });
 }

 function showVersionSelectorDialog(versions, current, onSelect) {
@@ -612,7 +616,7 @@ async function switchComfyUI() {
|
||||
switch_comfyui_button.disabled = true;
|
||||
switch_comfyui_button.style.backgroundColor = "gray";
|
||||
|
||||
let res = await api.fetchApi(`/comfyui_manager/comfyui_versions`, { cache: "no-store" });
|
||||
let res = await api.fetchApi(`/v2/comfyui_manager/comfyui_versions`, { cache: "no-store" });
|
||||
|
||||
switch_comfyui_button.disabled = false;
|
||||
switch_comfyui_button.style.backgroundColor = "";
|
||||
@@ -631,14 +635,14 @@ async function switchComfyUI() {
|
||||
showVersionSelectorDialog(versions, obj.current, async (selected_version) => {
|
||||
if(selected_version == 'nightly') {
|
||||
update_policy_combo.value = 'nightly-comfyui';
|
||||
api.fetchApi('/manager/policy/update?value=nightly-comfyui');
|
||||
api.fetchApi('/v2/manager/policy/update?value=nightly-comfyui');
|
||||
}
|
||||
else {
|
||||
update_policy_combo.value = 'stable-comfyui';
|
||||
api.fetchApi('/manager/policy/update?value=stable-comfyui');
|
||||
api.fetchApi('/v2/manager/policy/update?value=stable-comfyui');
|
||||
}
|
||||
|
||||
let response = await api.fetchApi(`/comfyui_manager/comfyui_switch_version?ver=${selected_version}`, { cache: "no-store" });
|
||||
let response = await api.fetchApi(`/v2/comfyui_manager/comfyui_switch_version?ver=${selected_version}`, { cache: "no-store" });
|
||||
if (response.status == 200) {
|
||||
infoToast(`ComfyUI version is switched to ${selected_version}`);
|
||||
}
|
||||
@@ -656,18 +660,17 @@ async function onQueueStatus(event) {
|
||||
const isElectron = 'electronAPI' in window;
|
||||
|
||||
if(event.detail.status == 'in_progress') {
|
||||
set_inprogress_mode();
|
||||
// set_inprogress_mode();
|
||||
update_all_button.innerText = `in progress.. (${event.detail.done_count}/${event.detail.total_count})`;
|
||||
}
|
||||
else if(event.detail.status == 'done') {
|
||||
else if(event.detail.status == 'all-done') {
|
||||
reset_action_buttons();
|
||||
|
||||
if(!is_updating) {
|
||||
}
|
||||
else if(event.detail.status == 'batch-done') {
|
||||
if(batch_id != event.detail.batch_id) {
|
||||
return;
|
||||
}
|
||||
|
||||
is_updating = false;
|
||||
|
||||
let success_list = [];
|
||||
let failed_list = [];
|
||||
let comfyui_state = null;
|
||||
@@ -767,41 +770,28 @@ api.addEventListener("cm-queue-status", onQueueStatus);
|
||||
async function updateAll(update_comfyui) {
|
||||
update_all_button.innerText = "Updating...";
|
||||
|
||||
set_inprogress_mode();
|
||||
// set_inprogress_mode();
|
||||
|
||||
var mode = manager_instance.datasrc_combo.value;
|
||||
|
||||
showTerminal();
|
||||
|
||||
batch_id = generateUUID();
|
||||
|
||||
let batch = {};
|
||||
if(update_comfyui) {
|
||||
update_all_button.innerText = "Updating ComfyUI...";
|
||||
await api.fetchApi('/manager/queue/update_comfyui');
|
||||
batch['update_comfyui'] = true;
|
||||
}
|
||||
|
||||
const response = await api.fetchApi(`/manager/queue/update_all?mode=${mode}`);
|
||||
batch['update_all'] = mode;
|
||||
|
||||
if (response.status == 401) {
|
||||
customAlert('Another task is already in progress. Please stop the ongoing task first.');
|
||||
}
|
||||
else if(response.status == 200) {
|
||||
is_updating = true;
|
||||
await api.fetchApi('/manager/queue/start');
|
||||
}
|
||||
const res = await api.fetchApi(`/v2/manager/queue/batch`, {
|
||||
method: 'POST',
|
||||
body: JSON.stringify(batch)
|
||||
});
|
||||
}
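Both update paths above now funnel into a single POST to `/v2/manager/queue/batch`. A minimal sketch of that request from outside the frontend — the endpoint and the `batch_id`/`update_comfyui`/`update_all` keys come straight from the diff, while the host/port and the response handling are assumptions for illustration:

```python
# Sketch only: assumes a local ComfyUI server on 127.0.0.1:8188.
import json
import urllib.request
import uuid

batch = {
    "batch_id": str(uuid.uuid4()),  # client-generated id, echoed back in 'batch-done' events
    "update_comfyui": True,         # optional: also update ComfyUI itself
    "update_all": "cache",          # optional: update all node packs using this DB mode
}

req = urllib.request.Request(
    "http://127.0.0.1:8188/v2/manager/queue/batch",
    data=json.dumps(batch).encode("utf-8"),
    method="POST",
)
with urllib.request.urlopen(req) as res:
    # The managers below parse the JSON body as a list of failed item hashes.
    print(res.status, res.read().decode("utf-8"))
```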

function newDOMTokenList(initialTokens) {
const tmp = document.createElement(`div`);

const classList = tmp.classList;
if (initialTokens) {
initialTokens.forEach(token => {
classList.add(token);
});
}

return classList;
}

/**
* Check whether the node is a potential output node (img, gif or video output)
*/
@@ -814,7 +804,7 @@ function restartOrStop() {
rebootAPI();
}
else {
api.fetchApi('/manager/queue/reset');
api.fetchApi('/v2/manager/queue/reset');
infoToast('Cancel', 'Remaining tasks will stop after completing the current task.');
}
}
@@ -962,12 +952,12 @@ class ManagerMenuDialog extends ComfyDialog {
this.datasrc_combo.appendChild($el('option', { value: 'local', text: 'DB: Local' }, []));
this.datasrc_combo.appendChild($el('option', { value: 'remote', text: 'DB: Channel (remote)' }, []));

api.fetchApi('/manager/db_mode')
api.fetchApi('/v2/manager/db_mode')
.then(response => response.text())
.then(data => { this.datasrc_combo.value = data; });

this.datasrc_combo.addEventListener('change', function (event) {
api.fetchApi(`/manager/db_mode?value=${event.target.value}`);
api.fetchApi(`/v2/manager/db_mode?value=${event.target.value}`);
});

// preview method
@@ -979,19 +969,19 @@ class ManagerMenuDialog extends ComfyDialog {
preview_combo.appendChild($el('option', { value: 'latent2rgb', text: 'Preview method: Latent2RGB (fast)' }, []));
preview_combo.appendChild($el('option', { value: 'none', text: 'Preview method: None (very fast)' }, []));

api.fetchApi('/manager/preview_method')
api.fetchApi('/v2/manager/preview_method')
.then(response => response.text())
.then(data => { preview_combo.value = data; });

preview_combo.addEventListener('change', function (event) {
api.fetchApi(`/manager/preview_method?value=${event.target.value}`);
api.fetchApi(`/v2/manager/preview_method?value=${event.target.value}`);
});

// channel
let channel_combo = document.createElement("select");
channel_combo.setAttribute("title", "Configure the channel for retrieving data from the Custom Node list (including missing nodes) or the Model list.");
channel_combo.className = "cm-menu-combo";
api.fetchApi('/manager/channel_url_list')
api.fetchApi('/v2/manager/channel_url_list')
.then(response => response.json())
.then(async data => {
try {
@@ -1004,7 +994,7 @@ class ManagerMenuDialog extends ComfyDialog {
}

channel_combo.addEventListener('change', function (event) {
api.fetchApi(`/manager/channel_url_list?value=${event.target.value}`);
api.fetchApi(`/v2/manager/channel_url_list?value=${event.target.value}`);
});

channel_combo.value = data.selected;
@@ -1032,7 +1022,7 @@ class ManagerMenuDialog extends ComfyDialog {
share_combo.appendChild($el('option', { value: option[0], text: `Share: ${option[1]}` }, []));
}

api.fetchApi('/manager/share_option')
api.fetchApi('/v2/manager/share_option')
.then(response => response.text())
.then(data => {
share_combo.value = data || 'all';
@@ -1042,7 +1032,7 @@ class ManagerMenuDialog extends ComfyDialog {
share_combo.addEventListener('change', function (event) {
const value = event.target.value;
share_option = value;
api.fetchApi(`/manager/share_option?value=${value}`);
api.fetchApi(`/v2/manager/share_option?value=${value}`);
const shareButton = document.getElementById("shareButton");
if (value === 'none') {
shareButton.style.display = "none";
@@ -1057,7 +1047,7 @@ class ManagerMenuDialog extends ComfyDialog {
component_policy_combo.appendChild($el('option', { value: 'workflow', text: 'Component: Use workflow version' }, []));
component_policy_combo.appendChild($el('option', { value: 'higher', text: 'Component: Use higher version' }, []));
component_policy_combo.appendChild($el('option', { value: 'mine', text: 'Component: Use my version' }, []));
api.fetchApi('/manager/policy/component')
api.fetchApi('/v2/manager/policy/component')
.then(response => response.text())
.then(data => {
component_policy_combo.value = data;
@@ -1065,7 +1055,7 @@ class ManagerMenuDialog extends ComfyDialog {
});

component_policy_combo.addEventListener('change', function (event) {
api.fetchApi(`/manager/policy/component?value=${event.target.value}`);
api.fetchApi(`/v2/manager/policy/component?value=${event.target.value}`);
set_component_policy(event.target.value);
});

@@ -1078,14 +1068,14 @@ class ManagerMenuDialog extends ComfyDialog {
update_policy_combo.className = "cm-menu-combo";
update_policy_combo.appendChild($el('option', { value: 'stable-comfyui', text: 'Update: ComfyUI Stable Version' }, []));
update_policy_combo.appendChild($el('option', { value: 'nightly-comfyui', text: 'Update: ComfyUI Nightly Version' }, []));
api.fetchApi('/manager/policy/update')
api.fetchApi('/v2/manager/policy/update')
.then(response => response.text())
.then(data => {
update_policy_combo.value = data;
});

update_policy_combo.addEventListener('change', function (event) {
api.fetchApi(`/manager/policy/update?value=${event.target.value}`);
api.fetchApi(`/v2/manager/policy/update?value=${event.target.value}`);
});

return [
@@ -1388,12 +1378,12 @@ class ManagerMenuDialog extends ComfyDialog {
}

async function getVersion() {
let version = await api.fetchApi(`/manager/version`);
let version = await api.fetchApi(`/v2/manager/version`);
return await version.text();
}

app.registerExtension({
name: "Comfy.ManagerMenu",
name: "Comfy.Legacy.ManagerMenu",

aboutPageBadges: [
{
@@ -1524,8 +1514,6 @@ app.registerExtension({
tooltip: "Share"
}).element
);

app.menu?.settingsGroup.element.before(cmGroup.element);
}
catch(exception) {
console.log('ComfyUI is outdated. New style menu based features are disabled.');
@@ -172,7 +172,7 @@ export const shareToEsheep= () => {
const nodes = app.graph._nodes
const { potential_outputs, potential_output_nodes } = getPotentialOutputsAndOutputNodes(nodes);
const workflow = prompt['workflow']
api.fetchApi(`/manager/set_esheep_workflow_and_images`, {
api.fetchApi(`/v2/manager/set_esheep_workflow_and_images`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
@@ -552,6 +552,20 @@ export class ShareDialog extends ComfyDialog {
this.matrix_destination_checkbox.style.color = "var(--fg-color)";
this.matrix_destination_checkbox.checked = this.share_option === 'matrix'; //true;

try {
api.fetchApi(`/v2/manager/get_matrix_dep_status`)
.then(response => response.text())
.then(data => {
if(data == 'unavailable') {
matrix_destination_checkbox_text.style.textDecoration = "line-through";
this.matrix_destination_checkbox.disabled = true;
this.matrix_destination_checkbox.title = "It has been disabled because the 'matrix-nio' dependency is not installed. Please install this dependency to use the matrix sharing feature.";
matrix_destination_checkbox_text.title = "It has been disabled because the 'matrix-nio' dependency is not installed. Please install this dependency to use the matrix sharing feature.";
}
})
.catch(error => {});
} catch (error) {}

this.comfyworkflows_destination_checkbox = $el("input", { type: 'checkbox', id: "comfyworkflows_destination" }, [])
const comfyworkflows_destination_checkbox_text = $el("label", {}, [" ComfyWorkflows.com"])
this.comfyworkflows_destination_checkbox.style.color = "var(--fg-color)";
@@ -812,7 +826,7 @@ export class ShareDialog extends ComfyDialog {
// get the user's existing matrix auth and share key
ShareDialog.matrix_auth = { homeserver: "matrix.org", username: "", password: "" };
try {
api.fetchApi(`/manager/get_matrix_auth`)
api.fetchApi(`/v2/manager/get_matrix_auth`)
.then(response => response.json())
.then(data => {
ShareDialog.matrix_auth = data;
@@ -831,7 +845,7 @@ export class ShareDialog extends ComfyDialog {
ShareDialog.cw_sharekey = "";
try {
// console.log("Fetching comfyworkflows share key")
api.fetchApi(`/manager/get_comfyworkflows_auth`)
api.fetchApi(`/v2/manager/get_comfyworkflows_auth`)
.then(response => response.json())
.then(data => {
ShareDialog.cw_sharekey = data.comfyworkflows_sharekey;
@@ -891,7 +905,7 @@ export class ShareDialog extends ComfyDialog {
// Change the text of the share button to "Sharing..." to indicate that the share process has started
this.share_button.textContent = "Sharing...";

const response = await api.fetchApi(`/manager/share`, {
const response = await api.fetchApi(`/v2/manager/share`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
@@ -71,7 +71,7 @@ export class CopusShareDialog extends ComfyDialog {
this.allFiles = [];
this.titleNum = 0;
}


createButtons() {
const inputStyle = {
display: "block",
@@ -201,13 +201,15 @@ export class CopusShareDialog extends ComfyDialog {
});
this.LockInput = $el("input", {
type: "text",
placeholder: "",
style: {
placeholder: "0",
style: {
width: "100px",
padding: "7px",
paddingLeft: "30px",
borderRadius: "4px",
border: "1px solid #ddd",
boxSizing: "border-box",
position: "relative",
},
oninput: (event) => {
let input = event.target.value;
@@ -301,7 +303,7 @@ export class CopusShareDialog extends ComfyDialog {
},
[]
);


const titleNumDom = $el(
"label",
{
@@ -342,15 +344,11 @@ export class CopusShareDialog extends ComfyDialog {
["0/70"]
);
// Additional Inputs Section
const additionalInputsSection = $el(
"div",
{ style: { ...sectionStyle, } },
[
$el("label", { style: labelStyle }, ["3️⃣ Title "]),
this.TitleInput,
titleNumDom,
]
);
const additionalInputsSection = $el("div", { style: { ...sectionStyle } }, [
$el("label", { style: labelStyle }, ["3️⃣ Title "]),
this.TitleInput,
titleNumDom,
]);
const SubtitleSection = $el("div", { style: sectionStyle }, [
$el("label", { style: labelStyle }, ["4️⃣ Subtitle "]),
this.SubTitleInput,
@@ -379,7 +377,7 @@ export class CopusShareDialog extends ComfyDialog {
});

const blockChainSection_lock = $el("div", { style: sectionStyle }, [
$el("label", { style: labelStyle }, ["6️⃣ Pay to download"]),
$el("label", { style: labelStyle }, ["6️⃣ Download threshold"]),
$el(
"label",
{
@@ -392,11 +390,42 @@ export class CopusShareDialog extends ComfyDialog {
},
[
this.radioButtonsCheck_lock,
$el("div", { style: { marginLeft: "5px" ,display:'flex',alignItems:'center'} }, [
$el("span", { style: { marginLeft: "5px" } }, ["ON"]),
$el("span", { style: { marginLeft: "20px",marginRight:'10px' ,color:'#fff'} }, ["Price US$"]),
this.LockInput
]),
$el(
"div",
{
style: {
marginLeft: "5px",
display: "flex",
alignItems: "center",
position: "relative",
},
},
[
$el("span", { style: { marginLeft: "5px" } }, ["ON"]),
$el(
"span",
{
style: {
marginLeft: "20px",
marginRight: "10px",
color: "#fff",
},
},
["Unlock with"]
),
$el("img", {
style: {
width: "16px",
height: "16px",
position: "absolute",
right: "75px",
zIndex: "100",
},
src: "https://static.copus.io/images/admin/202507/prod/e2919a1d8f3c2d99d3b8fe27ff94b841.png",
}),
this.LockInput,
]
),
]
),
$el(
@@ -404,14 +433,25 @@ export class CopusShareDialog extends ComfyDialog {
{ style: { display: "flex", alignItems: "center", cursor: "pointer" } },
[
this.radioButtonsCheckOff_lock,
$el("span", { style: { marginLeft: "5px" } }, ["OFF"]),
$el(
"div",
{
style: {
marginLeft: "5px",
display: "flex",
alignItems: "center",
},
},
[$el("span", { style: { marginLeft: "5px" } }, ["OFF"])]
),
]
),


$el(
"p",
{ style: { fontSize: "16px", color: "#fff", margin: "10px 0 0 0" } },
["Get paid from your workflow. You can change the price and withdraw your earnings on Copus."]
[
]
),
]);

@@ -432,7 +472,7 @@ export class CopusShareDialog extends ComfyDialog {
});

const blockChainSection = $el("div", { style: sectionStyle }, [
$el("label", { style: labelStyle }, ["7️⃣ Store on blockchain "]),
$el("label", { style: labelStyle }, ["8️⃣ Store on blockchain "]),
$el(
"label",
{
@@ -463,6 +503,139 @@ export class CopusShareDialog extends ComfyDialog {
),
]);

this.ratingRadioButtonsCheck0 = $el("input", {
type: "radio",
name: "content_rating",
value: "0",
id: "content_rating0",
});
this.ratingRadioButtonsCheck1 = $el("input", {
type: "radio",
name: "content_rating",
value: "1",
id: "content_rating1",
});
this.ratingRadioButtonsCheck2 = $el("input", {
type: "radio",
name: "content_rating",
value: "2",
id: "content_rating2",
});
this.ratingRadioButtonsCheck_1 = $el("input", {
type: "radio",
name: "content_rating",
value: "-1",
id: "content_rating_1",
checked: true,
});

// content rating
const contentRatingSection = $el("div", { style: sectionStyle }, [
$el("label", { style: labelStyle }, ["7️⃣ Content rating "]),
$el(
"label",
{
style: {
marginTop: "10px",
display: "flex",
alignItems: "center",
cursor: "pointer",
},
},
[
this.ratingRadioButtonsCheck0,
$el("img", {
style: {
width: "12px",
height: "12px",
marginLeft: "5px",
},
src: "https://static.copus.io/images/client/202507/test/b9f17da83b054d53cd0cb4508c2c30dc.png",
}),
$el("span", { style: { marginLeft: "5px", color: "#fff" } }, [
"All ages",
]),
]
),
$el(
"p",
{ style: { fontSize: "10px", color: "#fff", marginLeft: "20px" } },
["Safe for all viewers; no profanity, violence, or mature themes."]
),
$el(
"label",
{ style: { display: "flex", alignItems: "center", cursor: "pointer" } },
[
this.ratingRadioButtonsCheck1,
$el("img", {
style: {
width: "12px",
height: "12px",
marginLeft: "5px",
},
src: "https://static.copus.io/images/client/202507/test/7848bc0d3690671df21c7cf00c4cfc81.png",
}),
$el("span", { style: { marginLeft: "5px", color: "#fff" } }, [
"13+ (Teen)",
]),
]
),
$el(
"p",
{ style: { fontSize: "10px", color: "#fff", marginLeft: "20px" } },
[
"Mild language, light themes, or cartoon violence; no explicit content. ",
]
),
$el(
"label",
{ style: { display: "flex", alignItems: "center", cursor: "pointer" } },
[
this.ratingRadioButtonsCheck2,
$el("img", {
style: {
width: "12px",
height: "12px",
marginLeft: "5px",
},
src: "https://static.copus.io/images/client/202507/test/bc51839c208d68d91173e43c23bff039.png",
}),
$el("span", { style: { marginLeft: "5px", color: "#fff" } }, [
"18+ (Explicit)",
]),
]
),
$el(
"p",
{ style: { fontSize: "10px", color: "#fff", marginLeft: "20px" } },
[
"Explicit content, including sexual content, strong violence, or intense themes. ",
]
),
$el(
"label",
{ style: { display: "flex", alignItems: "center", cursor: "pointer" } },
[
this.ratingRadioButtonsCheck_1,
$el("img", {
style: {
width: "12px",
height: "12px",
marginLeft: "5px",
},
src: "https://static.copus.io/images/client/202507/test/5c802fdcaaea4e7bbed37393eec0d5ba.png",
}),
$el("span", { style: { marginLeft: "5px", color: "#fff" } }, [
"Not Rated",
]),
]
),
$el(
"p",
{ style: { fontSize: "10px", color: "#fff", marginLeft: "20px" } },
["No age rating provided."]
),
]);

// Message Section
this.message = $el(
@@ -526,6 +699,7 @@ export class CopusShareDialog extends ComfyDialog {
DescriptionSection,
// contestSection,
blockChainSection_lock,
contentRatingSection,
blockChainSection,
this.message,
buttonsSection,
@@ -534,7 +708,7 @@ export class CopusShareDialog extends ComfyDialog {
return layout;
}
/**
* api
* api
* @param {url} path
* @param {params} options
* @param {statusText} statusText
@@ -587,7 +761,9 @@ export class CopusShareDialog extends ComfyDialog {
url: data,
});
} else {
throw new Error("make sure your API key is correct and try again later");
throw new Error(
"make sure your API key is correct and try again later"
);
}
} catch (e) {
if (e?.response?.status === 413) {
@@ -628,8 +804,15 @@ export class CopusShareDialog extends ComfyDialog {
subTitle: this.SubTitleInput.value,
content: this.descriptionInput.value,
storeOnChain: this.radioButtonsCheck.checked ? true : false,
lockState:this.radioButtonsCheck_lock.checked ? 2 : 0,
unlockPrice:this.LockInput.value,
lockState: this.radioButtonsCheck_lock.checked ? 2 : 0,
unlockPrice: this.LockInput.value,
rating: this.ratingRadioButtonsCheck0.checked
? 0
: this.ratingRadioButtonsCheck1.checked
? 1
: this.ratingRadioButtonsCheck2.checked
? 2
: -1,
};

if (!this.keyInput.value) {
@@ -644,8 +827,8 @@ export class CopusShareDialog extends ComfyDialog {
throw new Error("Title is required");
}

if(this.radioButtonsCheck_lock.checked){
if (!this.LockInput.value){
if (this.radioButtonsCheck_lock.checked) {
if (!this.LockInput.value) {
throw new Error("Price is required");
}
}
@@ -695,23 +878,23 @@ export class CopusShareDialog extends ComfyDialog {
"Uploading workflow..."
);

if (res.status && res.data.status && res.data) {
localStorage.setItem("copus_token",this.keyInput.value);
const { data } = res.data;
if (data) {
const url = `${DEFAULT_HOMEPAGE_URL}/work/${data}`;
this.message.innerHTML = `Workflow has been shared successfully. <a href="${url}" target="_blank">Click here to view it.</a>`;
this.previewImage.src = "";
this.previewImage.style.display = "none";
this.uploadedImages = [];
this.allFilesImages = [];
this.allFiles = [];
this.TitleInput.value = "";
this.SubTitleInput.value = "";
this.descriptionInput.value = "";
this.selectedFile = null;
}
}
if (res.status && res.data.status && res.data) {
localStorage.setItem("copus_token", this.keyInput.value);
const { data } = res.data;
if (data) {
const url = `${DEFAULT_HOMEPAGE_URL}/work/${data}`;
this.message.innerHTML = `Workflow has been shared successfully. <a href="${url}" target="_blank">Click here to view it.</a>`;
this.previewImage.src = "";
this.previewImage.style.display = "none";
this.uploadedImages = [];
this.allFilesImages = [];
this.allFiles = [];
this.TitleInput.value = "";
this.SubTitleInput.value = "";
this.descriptionInput.value = "";
this.selectedFile = null;
}
}
} catch (e) {
throw new Error("Error sharing workflow: " + e.message);
}
@@ -757,7 +940,7 @@ export class CopusShareDialog extends ComfyDialog {
this.element.style.display = "block";
this.previewImage.src = "";
this.previewImage.style.display = "none";
this.keyInput.value = apiToken!=null?apiToken:"";
this.keyInput.value = apiToken != null ? apiToken : "";
this.uploadedImages = [];
this.allFilesImages = [];
this.allFiles = [];
@@ -67,7 +67,7 @@ export class OpenArtShareDialog extends ComfyDialog {
async readKey() {
let key = ""
try {
key = await api.fetchApi(`/manager/get_openart_auth`)
key = await api.fetchApi(`/v2/manager/get_openart_auth`)
.then(response => response.json())
.then(data => {
return data.openart_key;
@@ -82,7 +82,7 @@ export class OpenArtShareDialog extends ComfyDialog {
}

async saveKey(value) {
await api.fetchApi(`/manager/set_openart_auth`, {
await api.fetchApi(`/v2/manager/set_openart_auth`, {
method: 'POST',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify({
@@ -399,7 +399,7 @@ export class OpenArtShareDialog extends ComfyDialog {
form.append("file", uploadFile);
try {
const res = await this.fetchApi(
`/workflows/upload_thumbnail`,
`/v2/workflows/upload_thumbnail`,
{
method: "POST",
body: form,
@@ -459,7 +459,7 @@ export class OpenArtShareDialog extends ComfyDialog {
throw new Error("Title is required");
}

const current_snapshot = await api.fetchApi(`/snapshot/get_current`)
const current_snapshot = await api.fetchApi(`/v2/snapshot/get_current`)
.then(response => response.json())
.catch(error => {
// console.log(error);
@@ -489,7 +489,7 @@ export class OpenArtShareDialog extends ComfyDialog {

try {
const response = await this.fetchApi(
"/workflows/publish",
"/v2/workflows/publish",
{
method: "POST",
headers: {"Content-Type": "application/json"},
@@ -179,7 +179,7 @@ export class YouMLShareDialog extends ComfyDialog {
async loadToken() {
let key = ""
try {
const response = await api.fetchApi(`/manager/youml/settings`)
const response = await api.fetchApi(`/v2/manager/youml/settings`)
const settings = await response.json()
return settings.token
} catch (error) {
@@ -188,7 +188,7 @@ export class YouMLShareDialog extends ComfyDialog {
}

async saveToken(value) {
await api.fetchApi(`/manager/youml/settings`, {
await api.fetchApi(`/v2/manager/youml/settings`, {
method: 'POST',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify({
@@ -380,7 +380,7 @@ export class YouMLShareDialog extends ComfyDialog {
try {
let snapshotData = null;
try {
const snapshot = await api.fetchApi(`/snapshot/get_current`)
const snapshot = await api.fetchApi(`/v2/snapshot/get_current`)
snapshotData = await snapshot.json()
} catch (e) {
console.error("Failed to get snapshot", e)
@@ -172,7 +172,7 @@ export function rebootAPI() {
customConfirm("Are you sure you'd like to reboot the server?").then((isConfirmed) => {
if (isConfirmed) {
try {
api.fetchApi("/manager/reboot");
api.fetchApi("/v2/manager/reboot");
}
catch(exception) {}
}
@@ -210,7 +210,7 @@ export async function install_pip(packages) {
if(packages.includes('&'))
app.ui.dialog.show(`Invalid PIP package enumeration: '${packages}'`);

const res = await api.fetchApi("/customnode/install/pip", {
const res = await api.fetchApi("/v2/customnode/install/pip", {
method: "POST",
body: packages,
});
@@ -245,7 +245,7 @@ export async function install_via_git_url(url, manager_dialog) {

show_message(`Wait...<BR><BR>Installing '${url}'`);

const res = await api.fetchApi("/customnode/install/git_url", {
const res = await api.fetchApi("/v2/customnode/install/git_url", {
method: "POST",
body: url,
});
@@ -630,6 +630,14 @@ export function showTooltip(target, text, className = 'cn-tooltip', styleMap = {
});
}

export function generateUUID() {
return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, function(c) {
const r = Math.random() * 16 | 0;
const v = c === 'x' ? r : (r & 0x3 | 0x8);
return v.toString(16);
});
}
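`generateUUID()` above produces a Math.random-based RFC 4122 version-4 UUID on the client. On the Python side the same shape of id is available from the standard library; the interchangeability with the JS helper is an assumption for illustration, not something the diff states:

```python
import uuid

# e.g. '3f2504e0-4f89-41d3-9a0c-0305e82c3301' — same format as generateUUID()
batch_id = str(uuid.uuid4())
```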

function initTooltip () {
const mouseenterHandler = (e) => {
const target = e.target;
@@ -64,7 +64,7 @@ function storeGroupNode(name, data, register=true) {
}

export async function load_components() {
let data = await api.fetchApi('/manager/component/loads', {method: "POST"});
let data = await api.fetchApi('/v2/manager/component/loads', {method: "POST"});
let components = await data.json();

let start_time = Date.now();
@@ -222,7 +222,7 @@ async function save_as_component(node, version, author, prefix, nodename, packna
pack_map[packname] = component_name;
rpack_map[component_name] = subgraph;

const res = await api.fetchApi('/manager/component/save', {
const res = await api.fetchApi('/v2/manager/component/save', {
method: "POST",
headers: {
"Content-Type": "application/json",
@@ -259,7 +259,7 @@ async function import_component(component_name, component, mode) {
workflow: component
};

const res = await api.fetchApi('/manager/component/save', {
const res = await api.fetchApi('/v2/manager/component/save', {
method: "POST",
headers: { "Content-Type": "application/json", },
body: JSON.stringify(body)
@@ -709,7 +709,7 @@ app.handleFile = handleFile;

let current_component_policy = 'workflow';
try {
api.fetchApi('/manager/policy/component')
api.fetchApi('/v2/manager/policy/component')
.then(response => response.text())
.then(data => { current_component_policy = data; });
}
@@ -7,7 +7,7 @@ import {
fetchData, md5, icons, show_message, customConfirm, customAlert, customPrompt,
sanitizeHTML, infoToast, showTerminal, setNeedRestart,
storeColumnWidth, restoreColumnWidth, getTimeAgo, copyText, loadCss,
showPopover, hidePopover
showPopover, hidePopover, generateUUID
} from "./common.js";

// https://cenfun.github.io/turbogrid/api.html
@@ -66,7 +66,7 @@ export class CustomNodesManager {
this.id = "cn-manager";

app.registerExtension({
name: "Comfy.CustomNodesManager",
name: "Comfy.Legacy.CustomNodesManager",
afterConfigureGraph: (missingNodeTypes) => {
const item = this.getFilterItem(ShowMode.MISSING);
if (item) {
@@ -459,7 +459,7 @@ export class CustomNodesManager {

".cn-manager-stop": {
click: () => {
api.fetchApi('/manager/queue/reset');
api.fetchApi('/v2/manager/queue/reset');
infoToast('Cancel', 'Remaining tasks will stop after completing the current task.');
}
},
@@ -635,7 +635,7 @@ export class CustomNodesManager {
};
}

const response = await api.fetchApi(`/customnode/import_fail_info`, {
const response = await api.fetchApi(`/v2/customnode/import_fail_info`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(info)
@@ -714,6 +714,7 @@ export class CustomNodesManager {
link.href = rowItem.reference;
link.target = '_blank';
link.innerHTML = `<b>${title}</b>`;
link.title = rowItem.originalData.id;
container.appendChild(link);

return container;
@@ -1243,7 +1244,7 @@ export class CustomNodesManager {
async loadNodes(node_packs) {
const mode = manager_instance.datasrc_combo.value;
this.showStatus(`Loading node mappings (${mode}) ...`);
const res = await fetchData(`/customnode/getmappings?mode=${mode}`);
const res = await fetchData(`/v2/customnode/getmappings?mode=${mode}`);
if (res.error) {
console.log(res.error);
return;
@@ -1395,10 +1396,10 @@ export class CustomNodesManager {
this.showLoading();
let res;
if(is_enable) {
res = await api.fetchApi(`/customnode/disabled_versions/${node_id}`, { cache: "no-store" });
res = await api.fetchApi(`/v2/customnode/disabled_versions/${node_id}`, { cache: "no-store" });
}
else {
res = await api.fetchApi(`/customnode/versions/${node_id}`, { cache: "no-store" });
res = await api.fetchApi(`/v2/customnode/versions/${node_id}`, { cache: "no-store" });
}
this.hideLoading();

@@ -1440,13 +1441,6 @@ export class CustomNodesManager {
}

async installNodes(list, btn, title, selected_version) {
let stats = await api.fetchApi('/manager/queue/status');
stats = await stats.json();
if(stats.is_processing) {
customAlert(`[ComfyUI-Manager] There are already tasks in progress. Please try again after it is completed. (${stats.done_count}/${stats.total_count})`);
return;
}

const { target, label, mode} = btn;

if(mode === "uninstall") {
@@ -1473,10 +1467,10 @@ export class CustomNodesManager {
let needRestart = false;
let errorMsg = "";

await api.fetchApi('/manager/queue/reset');

let target_items = [];

let batch = {};

for (const hash of list) {
const item = this.grid.getRowItemBy("hash", hash);
target_items.push(item);
@@ -1518,23 +1512,11 @@ export class CustomNodesManager {
api_mode = 'reinstall';
}

const res = await api.fetchApi(`/manager/queue/${api_mode}`, {
method: 'POST',
body: JSON.stringify(data)
});

if (res.status != 200) {
errorMsg = `'${item.title}': `;

if(res.status == 403) {
errorMsg += `This action is not allowed with this security level configuration.\n`;
} else if(res.status == 404) {
errorMsg += `With the current security level configuration, only custom nodes from the <B>"default channel"</B> can be installed.\n`;
} else {
errorMsg += await res.text() + '\n';
}

break;
if(batch[api_mode]) {
batch[api_mode].push(data);
}
else {
batch[api_mode] = [data];
}
}

@@ -1551,7 +1533,24 @@ export class CustomNodesManager {
}
}
else {
await api.fetchApi('/manager/queue/start');
this.batch_id = generateUUID();
batch['batch_id'] = this.batch_id;

const res = await api.fetchApi(`/v2/manager/queue/batch`, {
method: 'POST',
body: JSON.stringify(batch)
});

let failed = await res.json();

if(failed.length > 0) {
for(let k in failed) {
let hash = failed[k];
const item = this.grid.getRowItemBy("hash", hash);
errorMsg = `[FAIL] ${item.title}`;
}
}

this.showStop();
showTerminal();
}
@@ -1559,6 +1558,9 @@ export class CustomNodesManager {

async onQueueStatus(event) {
let self = CustomNodesManager.instance;
// If legacy manager front is not open, return early (using new manager front)
if (self.element?.style.display === 'none') return

if(event.detail.status == 'in_progress' && event.detail.ui_target == 'nodepack_manager') {
const hash = event.detail.target;

@@ -1569,7 +1571,7 @@ export class CustomNodesManager {
self.grid.updateCell(item, "action");
self.grid.setRowSelected(item, false);
}
else if(event.detail.status == 'done') {
else if(event.detail.status == 'batch-done' && event.detail.batch_id == self.batch_id) {
self.hideStop();
self.onQueueCompleted(event.detail);
}
@@ -1624,17 +1626,35 @@ export class CustomNodesManager {
getNodesInWorkflow() {
let usedGroupNodes = new Set();
let allUsedNodes = {};
const visitedGraphs = new Set();

for(let k in app.graph._nodes) {
let node = app.graph._nodes[k];
const visitGraph = (graph) => {
if (!graph || visitedGraphs.has(graph)) return;
visitedGraphs.add(graph);

if(node.type.startsWith('workflow>')) {
usedGroupNodes.add(node.type.slice(9));
continue;
const nodes = graph._nodes || graph.nodes || [];
for(let k in nodes) {
let node = nodes[k];
if (!node) continue;

// If it's a SubgraphNode, recurse into its graph and continue searching
if (node.isSubgraphNode?.() && node.subgraph) {
visitGraph(node.subgraph);
}

if (!node.type) continue;

// Group nodes / components
if(typeof node.type === 'string' && node.type.startsWith('workflow>')) {
usedGroupNodes.add(node.type.slice(9));
continue;
}

allUsedNodes[node.type] = node;
}
};

allUsedNodes[node.type] = node;
}
visitGraph(app.graph);

for(let k of usedGroupNodes) {
let subnodes = app.graph.extra.groupNodes[k]?.nodes;
@@ -1745,7 +1765,7 @@ export class CustomNodesManager {
async getMissingNodesLegacy(hashMap, missing_nodes) {
const mode = manager_instance.datasrc_combo.value;
this.showStatus(`Loading missing nodes (${mode}) ...`);
const res = await fetchData(`/customnode/getmappings?mode=${mode}`);
const res = await fetchData(`/v2/customnode/getmappings?mode=${mode}`);
if (res.error) {
this.showError(`Failed to get custom node mappings: ${res.error}`);
return;
@@ -1860,7 +1880,7 @@ export class CustomNodesManager {
async getAlternatives() {
const mode = manager_instance.datasrc_combo.value;
this.showStatus(`Loading alternatives (${mode}) ...`);
const res = await fetchData(`/customnode/alternatives?mode=${mode}`);
const res = await fetchData(`/v2/customnode/alternatives?mode=${mode}`);
if (res.error) {
this.showError(`Failed to get alternatives: ${res.error}`);
return [];
@@ -1908,7 +1928,7 @@ export class CustomNodesManager {
infoToast('Fetching updated information. This may take some time if many custom nodes are installed.');
}

const res = await fetchData(`/customnode/getlist?mode=${mode}${skip_update}`);
const res = await fetchData(`/v2/customnode/getlist?mode=${mode}${skip_update}`);
if (res.error) {
this.showError("Failed to get custom node list.");
this.hideLoading();
@@ -3,7 +3,7 @@ import { $el } from "../../scripts/ui.js";
import {
manager_instance, rebootAPI,
fetchData, md5, icons, show_message, customAlert, infoToast, showTerminal,
storeColumnWidth, restoreColumnWidth, loadCss
storeColumnWidth, restoreColumnWidth, loadCss, generateUUID
} from "./common.js";
import { api } from "../../scripts/api.js";

@@ -81,10 +81,13 @@ export class ModelManager {
value: ""
}, {
label: "Installed",
value: "True"
value: "installed"
}, {
label: "Not Installed",
value: "False"
value: "not_installed"
}, {
label: "In Workflow",
value: "in_workflow"
}];

this.typeList = [{
@@ -172,7 +175,7 @@ export class ModelManager {

".cmm-manager-stop": {
click: () => {
api.fetchApi('/manager/queue/reset');
api.fetchApi('/v2/manager/queue/reset');
infoToast('Cancel', 'Remaining tasks will stop after completing the current task.');
}
},
@@ -254,12 +257,31 @@ export class ModelManager {
rowFilter: (rowItem) => {

const searchableColumns = ["name", "type", "base", "description", "filename", "save_path"];
const models_extensions = ['.ckpt', '.pt', '.pt2', '.bin', '.pth', '.safetensors', '.pkl', '.sft'];

let shouldShown = grid.highlightKeywordsFilter(rowItem, searchableColumns, this.keywords);

if (shouldShown) {
if(this.filter && rowItem.installed !== this.filter) {
return false;
if(this.filter) {
if (this.filter == "in_workflow") {
rowItem.in_workflow = null;
if (Array.isArray(app.graph._nodes)) {
app.graph._nodes.forEach((item, i) => {
if (Array.isArray(item.widgets_values)) {
item.widgets_values.forEach((_item, i) => {
if (rowItem.in_workflow === null && _item !== null && models_extensions.includes("." + _item.toString().split('.').pop())) {
let filename = _item.match(/([^\/]+)(?=\.\w+$)/)[0];
if (grid.highlightKeywordsFilter(rowItem, searchableColumns, filename)) {
rowItem.in_workflow = "True";
grid.highlightKeywordsFilter(rowItem, searchableColumns, "");
}
}
});
}
});
}
}
return ((this.filter == "installed" && rowItem.installed == "True") || (this.filter == "not_installed" && rowItem.installed == "False") || (this.filter == "in_workflow" && rowItem.in_workflow == "True"));
}

if(this.type && rowItem.type !== this.type) {
@@ -413,24 +435,16 @@ export class ModelManager {
}

async installModels(list, btn) {
let stats = await api.fetchApi('/manager/queue/status');

stats = await stats.json();
if(stats.is_processing) {
customAlert(`[ComfyUI-Manager] There are already tasks in progress. Please try again after it is completed. (${stats.done_count}/${stats.total_count})`);
return;
}

btn.classList.add("cmm-btn-loading");
this.showError("");

let needRefresh = false;
let errorMsg = "";

await api.fetchApi('/manager/queue/reset');

let target_items = [];

let batch = {};

for (const item of list) {
this.grid.scrollRowIntoView(item);
target_items.push(item);
@@ -446,21 +460,12 @@ export class ModelManager {
const data = item.originalData;
data.ui_id = item.hash;

const res = await api.fetchApi(`/manager/queue/install_model`, {
method: 'POST',
body: JSON.stringify(data)
});

if (res.status != 200) {
errorMsg = `'${item.name}': `;

if(res.status == 403) {
errorMsg += `This action is not allowed with this security level configuration.\n`;
} else {
errorMsg += await res.text() + '\n';
}

break;
if(batch['install_model']) {
batch['install_model'].push(data);
}
else {
batch['install_model'] = [data];
}
}

@@ -477,7 +482,24 @@ export class ModelManager {
}
}
else {
await api.fetchApi('/manager/queue/start');
this.batch_id = generateUUID();
batch['batch_id'] = this.batch_id;

const res = await api.fetchApi(`/v2/manager/queue/batch`, {
method: 'POST',
body: JSON.stringify(batch)
});

let failed = await res.json();

if(failed.length > 0) {
for(let k in failed) {
let hash = failed[k];
const item = self.grid.getRowItemBy("hash", hash);
errorMsg = `[FAIL] ${item.title}`;
}
}

this.showStop();
showTerminal();
}
@@ -497,7 +519,7 @@ export class ModelManager {
// self.grid.updateCell(item, "tg-column-select");
self.grid.updateRow(item);
}
else if(event.detail.status == 'done') {
else if(event.detail.status == 'batch-done') {
self.hideStop();
self.onQueueCompleted(event.detail);
}
@@ -623,7 +645,7 @@ export class ModelManager {

const mode = manager_instance.datasrc_combo.value;

const res = await fetchData(`/externalmodel/getlist?mode=${mode}`);
const res = await fetchData(`/v2/externalmodel/getlist?mode=${mode}`);
if (res.error) {
this.showError("Failed to get external model list.");
this.hideLoading();
@@ -795,4 +817,4 @@ export class ModelManager {
close() {
this.element.style.display = "none";
}
}
}
@@ -142,7 +142,7 @@ function node_info_copy(src, dest, connect_both, copy_shape) {
}

app.registerExtension({
name: "Comfy.Manager.NodeFixer",
name: "Comfy.Legacy.Manager.NodeFixer",
beforeRegisterNodeDef(nodeType, nodeData, app) {
addMenuHandler(nodeType, function (_, options) {
options.push({
@@ -153,6 +153,7 @@ app.registerExtension({
app.canvas.graph.add(new_node, false);
node_info_copy(this, new_node, true);
app.canvas.graph.remove(this);
requestAnimationFrame(() => app.canvas.setDirty(true, true))
},
});
});
@@ -7,7 +7,7 @@ import { manager_instance, rebootAPI, show_message } from "./common.js";
async function restore_snapshot(target) {
if(SnapshotManager.instance) {
try {
const response = await api.fetchApi(`/snapshot/restore?target=${target}`, { cache: "no-store" });
const response = await api.fetchApi(`/v2/snapshot/restore?target=${target}`, { cache: "no-store" });

if(response.status == 403) {
show_message('This action is not allowed with this security level configuration.');
@@ -35,7 +35,7 @@ async function restore_snapshot(target) {
async function remove_snapshot(target) {
if(SnapshotManager.instance) {
try {
const response = await api.fetchApi(`/snapshot/remove?target=${target}`, { cache: "no-store" });
const response = await api.fetchApi(`/v2/snapshot/remove?target=${target}`, { cache: "no-store" });

if(response.status == 403) {
show_message('This action is not allowed with this security level configuration.');
@@ -61,7 +61,7 @@ async function remove_snapshot(target) {

async function save_current_snapshot() {
try {
const response = await api.fetchApi('/snapshot/save', { cache: "no-store" });
const response = await api.fetchApi('/v2/snapshot/save', { cache: "no-store" });
app.ui.dialog.close();
return true;
}
@@ -76,7 +76,7 @@ async function save_current_snapshot() {
}

async function getSnapshotList() {
const response = await api.fetchApi(`/snapshot/getlist`);
const response = await api.fetchApi(`/v2/snapshot/getlist`);
const data = await response.json();
return data;
}
@@ -38,7 +38,7 @@ class WorkflowMetadataExtension {
* enabled is true if the node is enabled, false if it is disabled
*/
async getInstalledNodes() {
const res = await api.fetchApi("/customnode/installed");
const res = await api.fetchApi("/v2/customnode/installed");
return await res.json();
}

@@ -71,7 +71,7 @@ class WorkflowMetadataExtension {
if (cnr_id) nodeProperties.cnr_id = cnr_id;
else nodeProperties.aux_id = aux_id;
if (ver) nodeProperties.ver = ver.trim();
} else if (["nodes", "comfy_extras"].includes(moduleType)) {
} else if (["nodes", "comfy_extras", "comfy_api_nodes"].includes(moduleType)) {
nodeProperties.cnr_id = "comfy-core";
nodeProperties.ver = this.comfyCoreVersion;
}
0  comfyui_manager/legacy/__init__.py  Normal file
@@ -23,7 +23,6 @@ import yaml
import zipfile
import traceback
from concurrent.futures import ThreadPoolExecutor, as_completed
import toml

orig_print = print

@@ -32,22 +31,22 @@ from packaging import version

import uuid

glob_path = os.path.join(os.path.dirname(__file__)) # ComfyUI-Manager/glob
sys.path.append(glob_path)

import cm_global
import cnr_utils
import manager_util
import git_utils
import manager_downloader
from node_package import InstalledNodePackage
from ..common import cm_global
from ..common import cnr_utils
from ..common import manager_util
from ..common import git_utils
from ..common import manager_downloader
from ..common.node_package import InstalledNodePackage
from ..common.enums import NetworkMode, SecurityLevel, DBMode
from ..common import context


version_code = [3, 31, 12]
version_code = [4, 0, 2]
version_str = f"V{version_code[0]}.{version_code[1]}" + (f'.{version_code[2]}' if len(version_code) > 2 else '')


DEFAULT_CHANNEL = "https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main"
DEFAULT_CHANNEL = "https://raw.githubusercontent.com/Comfy-Org/ComfyUI-Manager/main"
DEFAULT_CHANNEL_LEGACY = "https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main"


default_custom_nodes_path = None
@@ -58,13 +57,14 @@ class InvalidChannel(Exception):
self.channel = channel
super().__init__(channel)


def get_default_custom_nodes_path():
global default_custom_nodes_path
if default_custom_nodes_path is None:
try:
import folder_paths
default_custom_nodes_path = folder_paths.get_folder_paths("custom_nodes")[0]
except:
except Exception:
default_custom_nodes_path = os.path.abspath(os.path.join(manager_util.comfyui_manager_path, '..'))

return default_custom_nodes_path
@@ -74,37 +74,11 @@ def get_custom_nodes_paths():
try:
import folder_paths
return folder_paths.get_folder_paths("custom_nodes")
except:
except Exception:
custom_nodes_path = os.path.abspath(os.path.join(manager_util.comfyui_manager_path, '..'))
return [custom_nodes_path]


def get_comfyui_tag():
try:
repo = git.Repo(comfy_path)
return repo.git.describe('--tags')
except:
return None


def get_current_comfyui_ver():
"""
Extract version from pyproject.toml
"""
toml_path = os.path.join(comfy_path, 'pyproject.toml')
if not os.path.exists(toml_path):
return None
else:
try:
with open(toml_path, "r", encoding="utf-8") as f:
data = toml.load(f)

project = data.get('project', {})
return project.get('version')
except:
return None


def get_script_env():
new_env = os.environ.copy()
git_exe = get_config().get('git_exe')
@@ -112,10 +86,10 @@ def get_script_env():
new_env['GIT_EXE_PATH'] = git_exe

if 'COMFYUI_PATH' not in new_env:
new_env['COMFYUI_PATH'] = comfy_path
new_env['COMFYUI_PATH'] = context.comfy_path

if 'COMFYUI_FOLDERS_BASE_PATH' not in new_env:
new_env['COMFYUI_FOLDERS_BASE_PATH'] = comfy_path
new_env['COMFYUI_FOLDERS_BASE_PATH'] = context.comfy_path

return new_env

@@ -137,12 +111,12 @@ def check_invalid_nodes():

try:
import folder_paths
except:
except Exception:
try:
sys.path.append(comfy_path)
sys.path.append(context.comfy_path)
import folder_paths
except:
raise Exception(f"Invalid COMFYUI_FOLDERS_BASE_PATH: {comfy_path}")
except Exception:
raise Exception(f"Invalid COMFYUI_FOLDERS_BASE_PATH: {context.comfy_path}")

def check(root):
global invalid_nodes
@@ -177,75 +151,6 @@ def check_invalid_nodes():
print("\n---------------------------------------------------------------------------\n")


# read env vars
comfy_path: str = os.environ.get('COMFYUI_PATH')
comfy_base_path = os.environ.get('COMFYUI_FOLDERS_BASE_PATH')

if comfy_path is None:
try:
import folder_paths
comfy_path = os.path.join(os.path.dirname(folder_paths.__file__))
except:
comfy_path = os.path.abspath(os.path.join(manager_util.comfyui_manager_path, '..', '..'))

if comfy_base_path is None:
comfy_base_path = comfy_path


channel_list_template_path = os.path.join(manager_util.comfyui_manager_path, 'channels.list.template')
git_script_path = os.path.join(manager_util.comfyui_manager_path, "git_helper.py")

manager_files_path = None
manager_config_path = None
manager_channel_list_path = None
manager_startup_script_path:str = None
manager_snapshot_path = None
manager_pip_overrides_path = None
manager_pip_blacklist_path = None
manager_components_path = None

def update_user_directory(user_dir):
global manager_files_path
global manager_config_path
global manager_channel_list_path
global manager_startup_script_path
global manager_snapshot_path
global manager_pip_overrides_path
global manager_pip_blacklist_path
global manager_components_path

manager_files_path = os.path.abspath(os.path.join(user_dir, 'default', 'ComfyUI-Manager'))
if not os.path.exists(manager_files_path):
os.makedirs(manager_files_path)

manager_snapshot_path = os.path.join(manager_files_path, "snapshots")
if not os.path.exists(manager_snapshot_path):
os.makedirs(manager_snapshot_path)

manager_startup_script_path = os.path.join(manager_files_path, "startup-scripts")
if not os.path.exists(manager_startup_script_path):
os.makedirs(manager_startup_script_path)

manager_config_path = os.path.join(manager_files_path, 'config.ini')
manager_channel_list_path = os.path.join(manager_files_path, 'channels.list')
manager_pip_overrides_path = os.path.join(manager_files_path, "pip_overrides.json")
manager_pip_blacklist_path = os.path.join(manager_files_path, "pip_blacklist.list")
manager_components_path = os.path.join(manager_files_path, "components")
manager_util.cache_dir = os.path.join(manager_files_path, "cache")

if not os.path.exists(manager_util.cache_dir):
os.makedirs(manager_util.cache_dir)

try:
import folder_paths
update_user_directory(folder_paths.get_user_directory())

except Exception:
# fallback:
# This case is only possible when running with cm-cli, and in practice, this case is not actually used.
update_user_directory(os.path.abspath(manager_util.comfyui_manager_path))


cached_config = None
js_path = None

@@ -256,7 +161,7 @@ comfy_ui_revision = "Unknown"
comfy_ui_commit_datetime = datetime(1900, 1, 1, 0, 0, 0)

channel_dict = None
valid_channels = {'default', 'local'}
valid_channels = {'default', 'local', DEFAULT_CHANNEL, DEFAULT_CHANNEL_LEGACY}
channel_list = None


@@ -400,18 +305,86 @@ class ManagedResult:
return self


class NormalizedKeyDict:
def __init__(self):
self._store = {}
self._key_map = {}

def _normalize_key(self, key):
if isinstance(key, str):
return key.strip().lower()
return key

def __setitem__(self, key, value):
norm_key = self._normalize_key(key)
self._key_map[norm_key] = key
self._store[key] = value

def __getitem__(self, key):
norm_key = self._normalize_key(key)
original_key = self._key_map[norm_key]
return self._store[original_key]

def __delitem__(self, key):
norm_key = self._normalize_key(key)
original_key = self._key_map.pop(norm_key)
del self._store[original_key]

def __contains__(self, key):
return self._normalize_key(key) in self._key_map

def get(self, key, default=None):
return self[key] if key in self else default

def setdefault(self, key, default=None):
if key in self:
return self[key]
self[key] = default
return default

def pop(self, key, default=None):
if key in self:
val = self[key]
del self[key]
return val
if default is not None:
return default
raise KeyError(key)

def keys(self):
return self._store.keys()

def values(self):
return self._store.values()

def items(self):
return self._store.items()

def __iter__(self):
return iter(self._store)

def __len__(self):
return len(self._store)

def __repr__(self):
return repr(self._store)

def to_dict(self):
return dict(self._store)
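A short usage sketch of the new `NormalizedKeyDict` (hypothetical values, not from the diff). Keys are normalized only for lookup — the original spelling is what the store keeps and iteration yields — and, unlike `dict.pop`, `pop()` raises `KeyError` even when an explicit default of `None` is passed:

```python
d = NormalizedKeyDict()
d["ComfyUI-Impact-Pack"] = "/custom_nodes/comfyui-impact-pack"

# Lookups are case- and surrounding-whitespace-insensitive...
assert "  comfyui-impact-pack " in d
assert d["COMFYUI-IMPACT-PACK"] == "/custom_nodes/comfyui-impact-pack"

# ...but iteration preserves the original key spelling.
assert list(d.keys()) == ["ComfyUI-Impact-Pack"]

assert d.pop("missing", "fallback") == "fallback"
# d.pop("missing")  # raises KeyError: default=None counts as "no default" here
```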


class UnifiedManager:
def __init__(self):
self.installed_node_packages: dict[str, InstalledNodePackage] = {}

self.cnr_inactive_nodes = {} # node_id -> node_version -> fullpath
self.nightly_inactive_nodes = {} # node_id -> fullpath
self.unknown_inactive_nodes = {} # node_id -> repo url * fullpath
self.active_nodes = {} # node_id -> node_version * fullpath
self.unknown_active_nodes = {} # node_id -> repo url * fullpath
self.cnr_map = {} # node_id -> cnr info
self.repo_cnr_map = {} # repo_url -> cnr info
self.custom_node_map_cache = {} # (channel, mode) -> augmented custom node list json
self.cnr_inactive_nodes = NormalizedKeyDict() # node_id -> node_version -> fullpath
self.nightly_inactive_nodes = NormalizedKeyDict() # node_id -> fullpath
self.unknown_inactive_nodes = {} # node_id -> repo url * fullpath
self.active_nodes = NormalizedKeyDict() # node_id -> node_version * fullpath
self.unknown_active_nodes = {} # node_id -> repo url * fullpath
self.cnr_map = NormalizedKeyDict() # node_id -> cnr info
self.repo_cnr_map = {} # repo_url -> cnr info
self.custom_node_map_cache = {} # (channel, mode) -> augmented custom node list json
self.processed_install = set()

def get_module_name(self, x):
@@ -553,7 +526,7 @@ class UnifiedManager:
ver = str(manager_util.StrictVersion(info['version']))
return {'id': cnr['id'], 'cnr': cnr, 'ver': ver}
else:
return None
return {'id': info['id'], 'ver': info['version']}
else:
return None

@@ -729,7 +702,9 @@ class UnifiedManager:

return latest

async def reload(self, cache_mode, dont_wait=True):
async def reload(self, cache_mode, dont_wait=True, update_cnr_map=True):
import folder_paths

self.custom_node_map_cache = {}
self.cnr_inactive_nodes = {} # node_id -> node_version -> fullpath
self.nightly_inactive_nodes = {} # node_id -> fullpath
@@ -737,17 +712,18 @@ class UnifiedManager:
self.unknown_active_nodes = {} # node_id -> repo url * fullpath
self.active_nodes = {} # node_id -> node_version * fullpath

if get_config()['network_mode'] != 'public':
if get_config()['network_mode'] != 'public' or manager_util.is_manager_pip_package():
dont_wait = True

# reload 'cnr_map' and 'repo_cnr_map'
cnrs = await cnr_utils.get_cnr_data(cache_mode=cache_mode=='cache', dont_wait=dont_wait)
if update_cnr_map:
# reload 'cnr_map' and 'repo_cnr_map'
cnrs = await cnr_utils.get_cnr_data(cache_mode=cache_mode=='cache', dont_wait=dont_wait)

for x in cnrs:
self.cnr_map[x['id']] = x
if 'repository' in x:
normalized_url = git_utils.normalize_url(x['repository'])
self.repo_cnr_map[normalized_url] = x
for x in cnrs:
self.cnr_map[x['id']] = x
if 'repository' in x:
normalized_url = git_utils.normalize_url(x['repository'])
self.repo_cnr_map[normalized_url] = x

# reload node status info from custom_nodes/*
for custom_nodes_path in folder_paths.get_folder_paths('custom_nodes'):
|
||||
@@ -795,7 +771,7 @@ class UnifiedManager:
|
||||
if 'id' in x:
|
||||
if x['id'] not in res:
|
||||
res[x['id']] = (x, True)
|
||||
except:
|
||||
except Exception:
|
||||
logging.error(f"[ComfyUI-Manager] broken item:{x}")
|
||||
|
||||
return res
|
||||
@@ -814,7 +790,7 @@ class UnifiedManager:
|
||||
channel = normalize_channel(channel)
|
||||
nodes = await self.load_nightly(channel, mode)
|
||||
|
||||
res = {}
|
||||
res = NormalizedKeyDict()
|
||||
added_cnr = set()
|
||||
for v in nodes.values():
|
||||
v = v[0]
|
||||
@@ -848,7 +824,7 @@ class UnifiedManager:
|
||||
def safe_version(ver_str):
|
||||
try:
|
||||
return version.parse(ver_str)
|
||||
except:
|
||||
except Exception:
|
||||
return version.parse("0.0.0")
|
||||
|
||||
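A quick sanity check of the safe_version fallback above (a minimal sketch; assumes `from packaging import version` as used elsewhere in this module): malformed version strings compare as 0.0.0 instead of raising.

    from packaging import version

    def safe_version(ver_str):
        try:
            return version.parse(ver_str)
        except Exception:
            return version.parse("0.0.0")

    assert safe_version("1.2.3") > safe_version("not-a-version")  # falls back to 0.0.0
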
    def execute_install_script(self, url, repo_path, instant_execution=False, lazy_mode=False, no_deps=False):

@@ -862,7 +838,7 @@ class UnifiedManager:
        else:
            if os.path.exists(requirements_path) and not no_deps:
                print("Install: pip packages")
                pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, manager_files_path)
                pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), context.comfy_path, context.manager_files_path)
                lines = manager_util.robust_readlines(requirements_path)
                for line in lines:
                    package_name = remap_pip_package(line.strip())

@@ -883,7 +859,7 @@ class UnifiedManager:
        return res

    def reserve_cnr_switch(self, target, zip_url, from_path, to_path, no_deps):
        script_path = os.path.join(manager_startup_script_path, "install-scripts.txt")
        script_path = os.path.join(context.manager_startup_script_path, "install-scripts.txt")
        with open(script_path, "a") as file:
            obj = [target, "#LAZY-CNR-SWITCH-SCRIPT", zip_url, from_path, to_path, no_deps, get_default_custom_nodes_path(), sys.executable]
            file.write(f"{obj}\n")

@@ -1289,7 +1265,7 @@ class UnifiedManager:
        print(f"Download: git clone '{clone_url}'")

        if not instant_execution and platform.system() == 'Windows':
            res = manager_funcs.run_script([sys.executable, git_script_path, "--clone", get_default_custom_nodes_path(), clone_url, repo_path], cwd=get_default_custom_nodes_path())
            res = manager_funcs.run_script([sys.executable, context.git_script_path, "--clone", get_default_custom_nodes_path(), clone_url, repo_path], cwd=get_default_custom_nodes_path())
            if res != 0:
                return result.fail(f"Failed to clone repo: {clone_url}")
        else:

@@ -1415,6 +1391,7 @@ class UnifiedManager:
            return ManagedResult('skip')
        elif self.is_disabled(node_id):
            return self.unified_enable(node_id)

        else:
            version_spec = self.resolve_unspecified_version(node_id)

@@ -1441,12 +1418,20 @@ class UnifiedManager:
            return self.unified_enable(node_id, version_spec)

        elif version_spec == 'unknown' or version_spec == 'nightly':
            to_path = os.path.abspath(os.path.join(get_default_custom_nodes_path(), node_id))

            if version_spec == 'nightly':
                # disable cnr nodes
                if self.is_enabled(node_id, 'cnr'):
                    self.unified_disable(node_id, False)

            to_path = os.path.abspath(os.path.join(get_default_custom_nodes_path(), node_id))
            # use `repo name` as a dir name instead of `cnr id` if system added nodepack (i.e. publisher is null)
            cnr = self.cnr_map.get(node_id)

            if cnr is not None and cnr.get('publisher') is None:
                repo_name = os.path.basename(git_utils.normalize_url(repo_url))
                to_path = os.path.abspath(os.path.join(get_default_custom_nodes_path(), repo_name))

            res = self.repo_install(repo_url, to_path, instant_execution=instant_execution, no_deps=no_deps, return_postinstall=return_postinstall)
            if res.result:
                if version_spec == 'unknown':

@@ -1507,7 +1492,7 @@ def identify_node_pack_from_path(fullpath):
    if github_id is None:
        try:
            github_id = os.path.basename(repo_url)
        except:
        except Exception:
            logging.warning(f"[ComfyUI-Manager] unexpected repo url: {repo_url}")
            github_id = module_name

@@ -1562,10 +1547,10 @@ def get_channel_dict():
    if channel_dict is None:
        channel_dict = {}

        if not os.path.exists(manager_channel_list_path):
            shutil.copy(channel_list_template_path, manager_channel_list_path)
        if not os.path.exists(context.manager_channel_list_path):
            shutil.copy(context.channel_list_template_path, context.manager_channel_list_path)

        with open(manager_channel_list_path, 'r') as file:
        with open(context.manager_channel_list_path, 'r') as file:
            channels = file.read()
            for x in channels.split('\n'):
                channel_info = x.split("::")

@@ -1629,29 +1614,31 @@ def write_config():
        'db_mode': get_config()['db_mode'],
    }

    directory = os.path.dirname(manager_config_path)
    directory = os.path.dirname(context.manager_config_path)
    if not os.path.exists(directory):
        os.makedirs(directory)

    with open(manager_config_path, 'w') as configfile:
    with open(context.manager_config_path, 'w') as configfile:
        config.write(configfile)


def read_config():
    try:
        config = configparser.ConfigParser(strict=False)
        config.read(manager_config_path)
        config.read(context.manager_config_path)
        default_conf = config['default']
        manager_util.use_uv = default_conf['use_uv'].lower() == 'true' if 'use_uv' in default_conf else False

        def get_bool(key, default_value):
            # fall back to default_value (not False) when the key is absent,
            # otherwise get_bool('use_uv', True) below could never default to True
            return default_conf[key].lower() == 'true' if key in default_conf else default_value

        manager_util.use_uv = default_conf['use_uv'].lower() == 'true' if 'use_uv' in default_conf else False
        manager_util.bypass_ssl = get_bool('bypass_ssl', False)

        return {
            'http_channel_enabled': get_bool('http_channel_enabled', False),
            'preview_method': default_conf.get('preview_method', manager_funcs.get_current_preview_method()).lower(),
            'git_exe': default_conf.get('git_exe', ''),
            'use_uv': get_bool('use_uv', False),
            'use_uv': get_bool('use_uv', True),
            'channel_url': default_conf.get('channel_url', DEFAULT_CHANNEL),
            'default_cache_as_channel_url': get_bool('default_cache_as_channel_url', False),
            'share_option': default_conf.get('share_option', 'all').lower(),

@@ -1663,22 +1650,24 @@ def read_config():
            'model_download_by_agent': get_bool('model_download_by_agent', False),
            'downgrade_blacklist': default_conf.get('downgrade_blacklist', '').lower(),
            'always_lazy_install': get_bool('always_lazy_install', False),
            'network_mode': default_conf.get('network_mode', 'public').lower(),
            'security_level': default_conf.get('security_level', 'normal').lower(),
            'db_mode': default_conf.get('db_mode', 'cache').lower(),
            'network_mode': default_conf.get('network_mode', NetworkMode.PUBLIC.value).lower(),
            'security_level': default_conf.get('security_level', SecurityLevel.NORMAL.value).lower(),
            'db_mode': default_conf.get('db_mode', DBMode.CACHE.value).lower(),
        }

    except Exception:
        manager_util.use_uv = False
        manager_util.bypass_ssl = False

        return {
            'http_channel_enabled': False,
            'preview_method': manager_funcs.get_current_preview_method(),
            'git_exe': '',
            'use_uv': False,
            'use_uv': True,
            'channel_url': DEFAULT_CHANNEL,
            'default_cache_as_channel_url': False,
            'share_option': 'all',
            'bypass_ssl': False,
            'bypass_ssl': manager_util.bypass_ssl,
            'file_logging': True,
            'component_policy': 'workflow',
            'update_policy': 'stable-comfyui',

@@ -1686,9 +1675,9 @@ def read_config():
            'model_download_by_agent': False,
            'downgrade_blacklist': '',
            'always_lazy_install': False,
            'network_mode': 'public',  # public | private | offline
            'security_level': 'normal',  # strong | normal | normal- | weak
            'db_mode': 'cache',  # local | cache | remote
            'network_mode': NetworkMode.PUBLIC.value,
            'security_level': SecurityLevel.NORMAL.value,
            'db_mode': DBMode.CACHE.value,
        }

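A minimal sketch of how the get_bool helper above reads the [default] section (the inline config string is illustrative; configparser usage mirrors read_config): a key present in the file is parsed, an absent key falls back to default_value, which is what lets the new use_uv default of True take effect.

    import configparser

    config = configparser.ConfigParser(strict=False)
    config.read_string("[default]\nbypass_ssl = true\n")
    default_conf = config['default']

    def get_bool(key, default_value):
        # absent keys fall back to default_value, not False
        return default_conf[key].lower() == 'true' if key in default_conf else default_value

    assert get_bool('bypass_ssl', False) is True   # present key, parsed from the file
    assert get_bool('use_uv', True) is True        # absent key -> default_value
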
@@ -1732,27 +1721,27 @@ def switch_to_default_branch(repo):
        default_branch = repo.git.symbolic_ref(f'refs/remotes/{remote_name}/HEAD').replace(f'refs/remotes/{remote_name}/', '')
        repo.git.checkout(default_branch)
        return True
    except:
    except Exception:
        # try checkout master
        # try checkout main if failed
        try:
            repo.git.checkout(repo.heads.master)
            return True
        except:
        except Exception:
            try:
                if remote_name is not None:
                    repo.git.checkout('-b', 'master', f'{remote_name}/master')
                    return True
            except:
            except Exception:
                try:
                    repo.git.checkout(repo.heads.main)
                    return True
                except:
                except Exception:
                    try:
                        if remote_name is not None:
                            repo.git.checkout('-b', 'main', f'{remote_name}/main')
                            return True
                    except:
                    except Exception:
                        pass

    print("[ComfyUI Manager] Failed to switch to the default branch")

@@ -1760,10 +1749,10 @@ def switch_to_default_branch(repo):


def reserve_script(repo_path, install_cmds):
    if not os.path.exists(manager_startup_script_path):
        os.makedirs(manager_startup_script_path)
    if not os.path.exists(context.manager_startup_script_path):
        os.makedirs(context.manager_startup_script_path)

    script_path = os.path.join(manager_startup_script_path, "install-scripts.txt")
    script_path = os.path.join(context.manager_startup_script_path, "install-scripts.txt")
    with open(script_path, "a") as file:
        obj = [repo_path] + install_cmds
        file.write(f"{obj}\n")
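Each deferred entry in install-scripts.txt is written as the repr of a Python list, one list per line. A sketch of reading one entry back (using ast.literal_eval here is an assumption for illustration; this diff does not show which parser the startup script actually uses):

    import ast

    # illustrative line, in the format reserve_script() writes above
    line = "['/path/to/ComfyUI/custom_nodes/some-pack', 'pip', 'install', '-r', 'requirements.txt']\n"
    obj = ast.literal_eval(line.strip())        # the repr round-trips back to a list of strings
    repo_path, install_cmds = obj[0], obj[1:]
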
@@ -1803,7 +1792,7 @@ def try_install_script(url, repo_path, install_cmd, instant_execution=False):
                print(f"[WARN] ComfyUI-Manager: Your ComfyUI version ({comfy_ui_revision})[{comfy_ui_commit_datetime.date()}] is too old. Please update to the latest version.")
                print("[WARN] The extension installation feature may not work properly in the current installed ComfyUI version on Windows environment.")
                print("###################################################################\n\n")
        except:
        except Exception:
            pass

        if code != 0:

@@ -1818,11 +1807,11 @@ def try_install_script(url, repo_path, install_cmd, instant_execution=False):
# use subprocess to avoid file system lock by git (Windows)
def __win_check_git_update(path, do_fetch=False, do_update=False):
    if do_fetch:
        command = [sys.executable, git_script_path, "--fetch", path]
        command = [sys.executable, context.git_script_path, "--fetch", path]
    elif do_update:
        command = [sys.executable, git_script_path, "--pull", path]
        command = [sys.executable, context.git_script_path, "--pull", path]
    else:
        command = [sys.executable, git_script_path, "--check", path]
        command = [sys.executable, context.git_script_path, "--check", path]

    new_env = get_script_env()
    process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=get_default_custom_nodes_path(), env=new_env)

@@ -1876,7 +1865,7 @@ def __win_check_git_update(path, do_fetch=False, do_update=False):


def __win_check_git_pull(path):
    command = [sys.executable, git_script_path, "--pull", path]
    command = [sys.executable, context.git_script_path, "--pull", path]
    process = subprocess.Popen(command, env=get_script_env(), cwd=get_default_custom_nodes_path())
    process.wait()

@@ -1892,7 +1881,7 @@ def execute_install_script(url, repo_path, lazy_mode=False, instant_execution=Fa
    else:
        if os.path.exists(requirements_path) and not no_deps:
            print("Install: pip packages")
            pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), comfy_path, manager_files_path)
            pip_fixer = manager_util.PIPFixer(manager_util.get_installed_packages(), context.comfy_path, context.manager_files_path)
            with open(requirements_path, "r") as requirements_file:
                for line in requirements_file:
                    # handle comments

@@ -2118,7 +2107,7 @@ async def gitclone_install(url, instant_execution=False, msg_prefix='', no_deps=
    clone_url = git_utils.get_url_for_clone(url)

    if not instant_execution and platform.system() == 'Windows':
        res = manager_funcs.run_script([sys.executable, git_script_path, "--clone", get_default_custom_nodes_path(), clone_url, repo_path], cwd=get_default_custom_nodes_path())
        res = manager_funcs.run_script([sys.executable, context.git_script_path, "--clone", get_default_custom_nodes_path(), clone_url, repo_path], cwd=get_default_custom_nodes_path())
        if res != 0:
            return result.fail(f"Failed to clone '{clone_url}' into '{repo_path}'")
    else:

@@ -2185,7 +2174,7 @@ async def get_data_by_mode(mode, filename, channel_url=None):
            cache_uri = str(manager_util.simple_hash(uri))+'_'+filename
            cache_uri = os.path.join(manager_util.cache_dir, cache_uri)

            if get_config()['network_mode'] == 'offline':
            if get_config()['network_mode'] == 'offline' or manager_util.is_manager_pip_package():
                # offline network mode
                if os.path.exists(cache_uri):
                    json_obj = await manager_util.get_data(cache_uri)

@@ -2205,7 +2194,7 @@ async def get_data_by_mode(mode, filename, channel_url=None):
                    with open(cache_uri, "w", encoding='utf-8') as file:
                        json.dump(json_obj, file, indent=4, sort_keys=True)
    except Exception as e:
        print(f"[ComfyUI-Manager] Due to a network error, switching to local mode.\n=> {filename}\n=> {e}")
        print(f"[ComfyUI-Manager] Due to a network error, switching to local mode.\n=> {filename} @ {channel_url}/{mode}\n=> {e}")
        uri = os.path.join(manager_util.comfyui_manager_path, filename)
        json_obj = await manager_util.get_data(uri)

@@ -2276,7 +2265,7 @@ def gitclone_uninstall(files):
            url = url[:-1]
        try:
            for custom_nodes_dir in get_custom_nodes_paths():
                dir_name = os.path.splitext(os.path.basename(url))[0].replace(".git", "")
                dir_name:str = os.path.splitext(os.path.basename(url))[0].replace(".git", "")
                dir_path = os.path.join(custom_nodes_dir, dir_name)

                # safety check

@@ -2324,7 +2313,7 @@ def gitclone_set_active(files, is_disable):
            url = url[:-1]
        try:
            for custom_nodes_dir in get_custom_nodes_paths():
                dir_name = os.path.splitext(os.path.basename(url))[0].replace(".git", "")
                dir_name:str = os.path.splitext(os.path.basename(url))[0].replace(".git", "")
                dir_path = os.path.join(custom_nodes_dir, dir_name)

                # safety check

@@ -2421,7 +2410,7 @@ def update_to_stable_comfyui(repo_path):
    repo = git.Repo(repo_path)
    try:
        repo.git.checkout(repo.heads.master)
    except:
    except Exception:
        logging.error(f"[ComfyUI-Manager] Failed to checkout 'master' branch.\nrepo_path={repo_path}\nAvailable branches:")
        for branch in repo.branches:
            logging.error('\t'+branch.name)

@@ -2444,7 +2433,7 @@ def update_to_stable_comfyui(repo_path):
        logging.info(f"[ComfyUI-Manager] Updating ComfyUI: {current_tag} -> {latest_tag}")
        repo.git.checkout(latest_tag)
        return 'updated', latest_tag
    except:
    except Exception:
        traceback.print_exc()
        return "fail", None

@@ -2597,7 +2586,7 @@ async def get_current_snapshot(custom_nodes_only = False):
    await unified_manager.get_custom_nodes('default', 'cache')

    # Get ComfyUI hash
    repo_path = comfy_path
    repo_path = context.comfy_path

    comfyui_commit_hash = None
    if not custom_nodes_only:

@@ -2642,7 +2631,7 @@ async def get_current_snapshot(custom_nodes_only = False):
                commit_hash = git_utils.get_commit_hash(fullpath)
                url = git_utils.git_url(fullpath)
                git_custom_nodes[url] = dict(hash=commit_hash, disabled=is_disabled)
            except:
            except Exception:
                print(f"Failed to extract snapshots for the custom node '{path}'.")

        elif path.endswith('.py'):

@@ -2673,7 +2662,7 @@ async def save_snapshot_with_postfix(postfix, path=None, custom_nodes_only = Fal
        date_time_format = now.strftime("%Y-%m-%d_%H-%M-%S")
        file_name = f"{date_time_format}_{postfix}"

        path = os.path.join(manager_snapshot_path, f"{file_name}.json")
        path = os.path.join(context.manager_snapshot_path, f"{file_name}.json")
    else:
        file_name = path.replace('\\', '/').split('/')[-1]
        file_name = file_name.split('.')[-2]

@@ -2700,7 +2689,7 @@ async def extract_nodes_from_workflow(filepath, mode='local', channel_url='defau
        with open(filepath, "r", encoding="UTF-8", errors="ignore") as json_file:
            try:
                workflow = json.load(json_file)
            except:
            except Exception:
                print(f"Invalid workflow file: {filepath}")
                exit(-1)

@@ -2713,7 +2702,7 @@ async def extract_nodes_from_workflow(filepath, mode='local', channel_url='defau
        else:
            try:
                workflow = json.loads(img.info['workflow'])
            except:
            except Exception:
                print(f"This is not a valid .png file containing a ComfyUI workflow: {filepath}")
                exit(-1)

@@ -2861,7 +2850,7 @@ async def get_unified_total_nodes(channel, mode, regsitry_cache_mode='cache'):

        if cnr_id is not None:
            # cnr or nightly version
            cnr_ids.remove(cnr_id)
            cnr_ids.discard(cnr_id)
            updatable = False
            cnr = unified_manager.cnr_map[cnr_id]

@@ -2984,7 +2973,7 @@ def populate_github_stats(node_packs, json_obj_github):
                v['stars'] = -1
                v['last_update'] = -1
                v['trust'] = False
        except:
        except Exception:
            logging.error(f"[ComfyUI-Manager] DB item is broken:\n{v}")


@@ -3025,6 +3014,11 @@ async def restore_snapshot(snapshot_path, git_helper_extras=None):
            info = yaml.load(snapshot_file, Loader=yaml.SafeLoader)
            info = info['custom_nodes']

        if 'pips' in info and info['pips']:
            pips = info['pips']
        else:
            pips = {}

        # for cnr restore
        cnr_info = info.get('cnr_custom_nodes')
        if cnr_info is not None:

@@ -3231,6 +3225,8 @@ async def restore_snapshot(snapshot_path, git_helper_extras=None):
                unified_manager.repo_install(repo_url, to_path, instant_execution=True, no_deps=False, return_postinstall=False)
                cloned_repos.append(repo_name)

    manager_util.restore_pip_snapshot(pips, git_helper_extras)

    # print summary
    for x in cloned_repos:
        print(f"[ INSTALLED ] {x}")

@@ -3255,12 +3251,12 @@ async def restore_snapshot(snapshot_path, git_helper_extras=None):

def get_comfyui_versions(repo=None):
    if repo is None:
        repo = git.Repo(comfy_path)
        repo = git.Repo(context.comfy_path)

    try:
        remote = get_remote_name(repo)
        repo.remotes[remote].fetch()
    except:
    except Exception:
        logging.error("[ComfyUI-Manager] Failed to fetch ComfyUI")

    versions = [x.name for x in repo.tags if x.name.startswith('v')]

@@ -3289,7 +3285,7 @@ def get_comfyui_versions(repo=None):


def switch_comfyui(tag):
    repo = git.Repo(comfy_path)
    repo = git.Repo(context.comfy_path)

    if tag == 'nightly':
        repo.git.checkout('master')

@@ -3329,5 +3325,5 @@ def repo_switch_commit(repo_path, commit_hash):

        repo.git.checkout(commit_hash)
        return True
    except:
    except Exception:
        return None
(File diff suppressed because it is too large. Load Diff)

comfyui_manager/legacy/share_3rdparty.py (new file, 451 lines)
@@ -0,0 +1,451 @@
import mimetypes
from ..common import context
from . import manager_core as core

import os
from aiohttp import web
import aiohttp
import json
import hashlib

import folder_paths
from server import PromptServer
import logging
import sys


try:
    from nio import AsyncClient, LoginResponse, UploadResponse
    matrix_nio_is_available = True
except Exception:
    logging.warning(f"[ComfyUI-Manager] The matrix sharing feature has been disabled because the `matrix-nio` dependency is not installed.\n\tTo use this feature, please run the following command:\n\t{sys.executable} -m pip install matrix-nio\n")
    matrix_nio_is_available = False


def extract_model_file_names(json_data):
    """Extract unique file names from the input JSON data."""
    file_names = set()
    model_filename_extensions = {'.safetensors', '.ckpt', '.pt', '.pth', '.bin'}

    # Recursively search for file names in the JSON data
    def recursive_search(data):
        if isinstance(data, dict):
            for value in data.values():
                recursive_search(value)
        elif isinstance(data, list):
            for item in data:
                recursive_search(item)
        elif isinstance(data, str) and '.' in data:
            file_names.add(os.path.basename(data))  # file_names.add(data)

    recursive_search(json_data)
    return [f for f in list(file_names) if os.path.splitext(f)[1] in model_filename_extensions]


def find_file_paths(base_dir, file_names):
    """Find the paths of the files in the base directory."""
    file_paths = {}

    for root, dirs, files in os.walk(base_dir):
        # Exclude certain directories
        dirs[:] = [d for d in dirs if d not in ['.git']]

        for file in files:
            if file in file_names:
                file_paths[file] = os.path.join(root, file)
    return file_paths


def compute_sha256_checksum(filepath):
    """Compute the SHA256 checksum of a file, in chunks"""
    sha256 = hashlib.sha256()
    with open(filepath, 'rb') as f:
        for chunk in iter(lambda: f.read(4096), b''):
            sha256.update(chunk)
    return sha256.hexdigest()

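The three helpers above are used together when building modelsInfo in share_art below. A minimal sketch of that flow (workflow_json and base_dir are illustrative placeholders, not values from this diff):

    # Illustrative pipeline: workflow JSON -> model filenames -> on-disk paths -> checksums.
    workflow_json = {"nodes": [{"inputs": {"ckpt_name": "sd_xl_base_1.0.safetensors"}}]}
    base_dir = "/path/to/ComfyUI/models"  # placeholder

    filenames = extract_model_file_names(workflow_json)
    paths = find_file_paths(base_dir, filenames)
    models_info = {
        name: {"filename": name, "sha256_checksum": compute_sha256_checksum(path)}
        for name, path in paths.items()
    }
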
@PromptServer.instance.routes.get("/v2/manager/share_option")
async def share_option(request):
    if "value" in request.rel_url.query:
        core.get_config()['share_option'] = request.rel_url.query['value']
        core.write_config()
    else:
        return web.Response(text=core.get_config()['share_option'], status=200)

    return web.Response(status=200)


def get_openart_auth():
    if not os.path.exists(os.path.join(context.manager_files_path, ".openart_key")):
        return None
    try:
        with open(os.path.join(context.manager_files_path, ".openart_key"), "r") as f:
            openart_key = f.read().strip()
            return openart_key if openart_key else None
    except Exception:
        return None


def get_matrix_auth():
    if not os.path.exists(os.path.join(context.manager_files_path, "matrix_auth")):
        return None
    try:
        with open(os.path.join(context.manager_files_path, "matrix_auth"), "r") as f:
            matrix_auth = f.read()
            homeserver, username, password = matrix_auth.strip().split("\n")
            if not homeserver or not username or not password:
                return None
            return {
                "homeserver": homeserver,
                "username": username,
                "password": password,
            }
    except Exception:
        return None


def get_comfyworkflows_auth():
    if not os.path.exists(os.path.join(context.manager_files_path, "comfyworkflows_sharekey")):
        return None
    try:
        with open(os.path.join(context.manager_files_path, "comfyworkflows_sharekey"), "r") as f:
            share_key = f.read()
            if not share_key.strip():
                return None
            return share_key
    except Exception:
        return None


def get_youml_settings():
    if not os.path.exists(os.path.join(context.manager_files_path, ".youml")):
        return None
    try:
        with open(os.path.join(context.manager_files_path, ".youml"), "r") as f:
            youml_settings = f.read().strip()
            return youml_settings if youml_settings else None
    except Exception:
        return None


def set_youml_settings(settings):
    with open(os.path.join(context.manager_files_path, ".youml"), "w") as f:
        f.write(settings)


@PromptServer.instance.routes.get("/v2/manager/get_openart_auth")
async def api_get_openart_auth(request):
    # print("Getting stored OpenArt key...")
    openart_key = get_openart_auth()
    if not openart_key:
        return web.Response(status=404)
    return web.json_response({"openart_key": openart_key})


@PromptServer.instance.routes.post("/v2/manager/set_openart_auth")
async def api_set_openart_auth(request):
    json_data = await request.json()
    openart_key = json_data['openart_key']
    with open(os.path.join(context.manager_files_path, ".openart_key"), "w") as f:
        f.write(openart_key)
    return web.Response(status=200)


@PromptServer.instance.routes.get("/v2/manager/get_matrix_auth")
async def api_get_matrix_auth(request):
    # print("Getting stored Matrix credentials...")
    matrix_auth = get_matrix_auth()
    if not matrix_auth:
        return web.Response(status=404)
    return web.json_response(matrix_auth)


@PromptServer.instance.routes.get("/v2/manager/youml/settings")
async def api_get_youml_settings(request):
    youml_settings = get_youml_settings()
    if not youml_settings:
        return web.Response(status=404)
    return web.json_response(json.loads(youml_settings))


@PromptServer.instance.routes.post("/v2/manager/youml/settings")
async def api_set_youml_settings(request):
    json_data = await request.json()
    set_youml_settings(json.dumps(json_data))
    return web.Response(status=200)


@PromptServer.instance.routes.get("/v2/manager/get_comfyworkflows_auth")
async def api_get_comfyworkflows_auth(request):
    # Check if the user has provided a Comfy Workflows share key in a file called
    # 'comfyworkflows_sharekey' in the ComfyUI-Manager files directory
    # print("Getting stored Comfyworkflows.com auth...")
    comfyworkflows_auth = get_comfyworkflows_auth()
    if not comfyworkflows_auth:
        return web.Response(status=404)
    return web.json_response({"comfyworkflows_sharekey": comfyworkflows_auth})


@PromptServer.instance.routes.post("/v2/manager/set_esheep_workflow_and_images")
async def set_esheep_workflow_and_images(request):
    json_data = await request.json()
    with open(os.path.join(context.manager_files_path, "esheep_share_message.json"), "w", encoding='utf-8') as file:
        json.dump(json_data, file, indent=4)
    return web.Response(status=200)


@PromptServer.instance.routes.get("/v2/manager/get_esheep_workflow_and_images")
async def get_esheep_workflow_and_images(request):
    with open(os.path.join(context.manager_files_path, "esheep_share_message.json"), 'r', encoding='utf-8') as file:
        data = json.load(file)
    return web.Response(status=200, text=json.dumps(data))


@PromptServer.instance.routes.get("/v2/manager/get_matrix_dep_status")
async def get_matrix_dep_status(request):
    if matrix_nio_is_available:
        return web.Response(status=200, text='available')
    else:
        return web.Response(status=200, text='unavailable')


def set_matrix_auth(json_data):
    homeserver = json_data['homeserver']
    username = json_data['username']
    password = json_data['password']
    with open(os.path.join(context.manager_files_path, "matrix_auth"), "w") as f:
        f.write("\n".join([homeserver, username, password]))


def set_comfyworkflows_auth(comfyworkflows_sharekey):
    with open(os.path.join(context.manager_files_path, "comfyworkflows_sharekey"), "w") as f:
        f.write(comfyworkflows_sharekey)


def has_provided_matrix_auth(matrix_auth):
    return matrix_auth['homeserver'].strip() and matrix_auth['username'].strip() and matrix_auth['password'].strip()


def has_provided_comfyworkflows_auth(comfyworkflows_sharekey):
    return comfyworkflows_sharekey.strip()


@PromptServer.instance.routes.post("/v2/manager/share")
async def share_art(request):
    # get json data
    json_data = await request.json()

    matrix_auth = json_data['matrix_auth']
    comfyworkflows_sharekey = json_data['cw_auth']['cw_sharekey']

    set_matrix_auth(matrix_auth)
    set_comfyworkflows_auth(comfyworkflows_sharekey)

    share_destinations = json_data['share_destinations']
    credits = json_data['credits']
    title = json_data['title']
    description = json_data['description']
    is_nsfw = json_data['is_nsfw']
    prompt = json_data['prompt']
    potential_outputs = json_data['potential_outputs']
    selected_output_index = json_data['selected_output_index']

    try:
        output_to_share = potential_outputs[int(selected_output_index)]
    except Exception:
        # for now, pick the first output
        output_to_share = potential_outputs[0]

    assert output_to_share['type'] in ('image', 'output')
    output_dir = folder_paths.get_output_directory()

    if output_to_share['type'] == 'image':
        asset_filename = output_to_share['image']['filename']
        asset_subfolder = output_to_share['image']['subfolder']

        if output_to_share['image']['type'] == 'temp':
            output_dir = folder_paths.get_temp_directory()
    else:
        asset_filename = output_to_share['output']['filename']
        asset_subfolder = output_to_share['output']['subfolder']

    if asset_subfolder:
        asset_filepath = os.path.join(output_dir, asset_subfolder, asset_filename)
    else:
        asset_filepath = os.path.join(output_dir, asset_filename)

    # get the mime type of the asset
    assetFileType = mimetypes.guess_type(asset_filepath)[0]

    share_website_host = "UNKNOWN"
    if "comfyworkflows" in share_destinations:
        share_website_host = "https://comfyworkflows.com"
        share_endpoint = f"{share_website_host}/api"

        # get presigned urls
        async with aiohttp.ClientSession(trust_env=True, connector=aiohttp.TCPConnector(verify_ssl=False)) as session:
            async with session.post(
                f"{share_endpoint}/get_presigned_urls",
                json={
                    "assetFileName": asset_filename,
                    "assetFileType": assetFileType,
                    "workflowJsonFileName": 'workflow.json',
                    "workflowJsonFileType": 'application/json',
                },
            ) as resp:
                assert resp.status == 200
                presigned_urls_json = await resp.json()
                assetFilePresignedUrl = presigned_urls_json["assetFilePresignedUrl"]
                assetFileKey = presigned_urls_json["assetFileKey"]
                workflowJsonFilePresignedUrl = presigned_urls_json["workflowJsonFilePresignedUrl"]
                workflowJsonFileKey = presigned_urls_json["workflowJsonFileKey"]

        # upload asset
        async with aiohttp.ClientSession(trust_env=True, connector=aiohttp.TCPConnector(verify_ssl=False)) as session:
            async with session.put(assetFilePresignedUrl, data=open(asset_filepath, "rb")) as resp:
                assert resp.status == 200

        # upload workflow json
        async with aiohttp.ClientSession(trust_env=True, connector=aiohttp.TCPConnector(verify_ssl=False)) as session:
            async with session.put(workflowJsonFilePresignedUrl, data=json.dumps(prompt['workflow']).encode('utf-8')) as resp:
                assert resp.status == 200

        model_filenames = extract_model_file_names(prompt['workflow'])
        model_file_paths = find_file_paths(folder_paths.base_path, model_filenames)

        models_info = {}
        for filename, filepath in model_file_paths.items():
            models_info[filename] = {
                "filename": filename,
                "sha256_checksum": compute_sha256_checksum(filepath),
                "relative_path": os.path.relpath(filepath, folder_paths.base_path),
            }

        # make a POST request to /api/upload_workflow with form data key values
        async with aiohttp.ClientSession(trust_env=True, connector=aiohttp.TCPConnector(verify_ssl=False)) as session:
            form = aiohttp.FormData()
            if comfyworkflows_sharekey:
                form.add_field("shareKey", comfyworkflows_sharekey)
            form.add_field("source", "comfyui_manager")
            form.add_field("assetFileKey", assetFileKey)
            form.add_field("assetFileType", assetFileType)
            form.add_field("workflowJsonFileKey", workflowJsonFileKey)
            form.add_field("sharedWorkflowWorkflowJsonString", json.dumps(prompt['workflow']))
            form.add_field("sharedWorkflowPromptJsonString", json.dumps(prompt['output']))
            form.add_field("shareWorkflowCredits", credits)
            form.add_field("shareWorkflowTitle", title)
            form.add_field("shareWorkflowDescription", description)
            form.add_field("shareWorkflowIsNSFW", str(is_nsfw).lower())
            form.add_field("currentSnapshot", json.dumps(await core.get_current_snapshot()))
            form.add_field("modelsInfo", json.dumps(models_info))

            async with session.post(
                f"{share_endpoint}/upload_workflow",
                data=form,
            ) as resp:
                assert resp.status == 200
                upload_workflow_json = await resp.json()
                workflowId = upload_workflow_json["workflowId"]

    # check if the user has provided Matrix credentials
    if matrix_nio_is_available and "matrix" in share_destinations:
        comfyui_share_room_id = '!LGYSoacpJPhIfBqVfb:matrix.org'
        filename = os.path.basename(asset_filepath)
        content_type = assetFileType

        try:
            homeserver = 'matrix.org'
            if matrix_auth:
                homeserver = matrix_auth.get('homeserver', 'matrix.org')
            homeserver = homeserver.replace("http://", "https://")
            if not homeserver.startswith("https://"):
                homeserver = "https://" + homeserver

            client = AsyncClient(homeserver, matrix_auth['username'])

            # Login
            login_resp = await client.login(matrix_auth['password'])
            if not isinstance(login_resp, LoginResponse) or not login_resp.access_token:
                await client.close()
                return web.json_response({"error": "Invalid Matrix credentials."}, content_type='application/json', status=400)

            # Upload asset
            with open(asset_filepath, 'rb') as f:
                upload_resp, _maybe_keys = await client.upload(f, content_type=content_type, filename=filename)
                f.seek(0)
                asset_data = f.read()  # re-read so the size can be reported in the m.image info below
            if not isinstance(upload_resp, UploadResponse) or not upload_resp.content_uri:
                await client.close()
                return web.json_response({"error": "Failed to upload asset to Matrix."}, content_type='application/json', status=500)
            mxc_url = upload_resp.content_uri

            # Upload workflow JSON
            import io
            workflow_json_bytes = json.dumps(prompt['workflow']).encode('utf-8')
            workflow_io = io.BytesIO(workflow_json_bytes)
            upload_workflow_resp, _maybe_keys = await client.upload(workflow_io, content_type='application/json', filename='workflow.json')
            workflow_io.seek(0)
            if not isinstance(upload_workflow_resp, UploadResponse) or not upload_workflow_resp.content_uri:
                await client.close()
                return web.json_response({"error": "Failed to upload workflow to Matrix."}, content_type='application/json', status=500)
            workflow_json_mxc_url = upload_workflow_resp.content_uri

            # Send text message
            text_content = ""
            if title:
                text_content += f"{title}\n"
            if description:
                text_content += f"{description}\n"
            if credits:
                text_content += f"\ncredits: {credits}\n"
            await client.room_send(
                room_id=comfyui_share_room_id,
                message_type="m.room.message",
                content={"msgtype": "m.text", "body": text_content}
            )

            # Send image
            await client.room_send(
                room_id=comfyui_share_room_id,
                message_type="m.room.message",
                content={
                    "msgtype": "m.image",
                    "body": filename,
                    "url": mxc_url,
                    "info": {
                        "mimetype": content_type,
                        "size": len(asset_data)
                    }
                }
            )

            # Send workflow JSON file
            await client.room_send(
                room_id=comfyui_share_room_id,
                message_type="m.room.message",
                content={
                    "msgtype": "m.file",
                    "body": "workflow.json",
                    "url": workflow_json_mxc_url,
                    "info": {
                        "mimetype": "application/json",
                        "size": len(workflow_json_bytes)
                    }
                }
            )

            await client.close()

        except Exception:
            import traceback
            traceback.print_exc()
            return web.json_response({"error": "An error occurred when sharing your art to Matrix."}, content_type='application/json', status=500)

    return web.json_response({
        "comfyworkflows": {
            "url": None if "comfyworkflows" not in share_destinations else f"{share_website_host}/workflows/{workflowId}",
        },
        "matrix": {
            "success": None if "matrix" not in share_destinations else True
        }
    }, content_type='application/json', status=200)
@@ -749,8 +749,8 @@
    "save_path": "loras/HyperSD/SDXL",
    "description": "Hyper-SD LoRA (4steps) - SDXL",
    "reference": "https://huggingface.co/ByteDance/Hyper-SD",
    "filename": "Hyper-SD15-4steps-lora.safetensors",
    "url": "https://huggingface.co/ByteDance/Hyper-SD/resolve/main/Hyper-SD15-4steps-lora.safetensors",
    "filename": "Hyper-SDXL-4steps-lora.safetensors",
    "url": "https://huggingface.co/ByteDance/Hyper-SD/resolve/main/Hyper-SDXL-4steps-lora.safetensors",
    "size": "787MB"
},
{
@@ -1973,6 +1973,97 @@
    "url": "https://dl.fbaipublicfiles.com/segment_anything/sam_vit_b_01ec64.pth",
    "size": "375.0MB"
},

{
    "name": "sam2.1_hiera_tiny.pt",
    "type": "sam2.1",
    "base": "SAM",
    "save_path": "sams",
    "description": "Segment Anything SAM 2.1 hiera model (tiny)",
    "reference": "https://github.com/facebookresearch/sam2#model-description",
    "filename": "sam2.1_hiera_tiny.pt",
    "url": "https://dl.fbaipublicfiles.com/segment_anything_2/092824/sam2.1_hiera_tiny.pt",
    "size": "149.0MB"
},
{
    "name": "sam2.1_hiera_small.pt",
    "type": "sam2.1",
    "base": "SAM",
    "save_path": "sams",
    "description": "Segment Anything SAM 2.1 hiera model (small)",
    "reference": "https://github.com/facebookresearch/sam2#model-description",
    "filename": "sam2.1_hiera_small.pt",
    "url": "https://dl.fbaipublicfiles.com/segment_anything_2/092824/sam2.1_hiera_small.pt",
    "size": "176.0MB"
},
{
    "name": "sam2.1_hiera_base_plus.pt",
    "type": "sam2.1",
    "base": "SAM",
    "save_path": "sams",
    "description": "Segment Anything SAM 2.1 hiera model (base+)",
    "reference": "https://github.com/facebookresearch/sam2#model-description",
    "filename": "sam2.1_hiera_base_plus.pt",
    "url": "https://dl.fbaipublicfiles.com/segment_anything_2/092824/sam2.1_hiera_base_plus.pt",
    "size": "309.0MB"
},
{
    "name": "sam2.1_hiera_large.pt",
    "type": "sam2.1",
    "base": "SAM",
    "save_path": "sams",
    "description": "Segment Anything SAM 2.1 hiera model (large)",
    "reference": "https://github.com/facebookresearch/sam2#model-description",
    "filename": "sam2.1_hiera_large.pt",
    "url": "https://dl.fbaipublicfiles.com/segment_anything_2/092824/sam2.1_hiera_large.pt",
    "size": "857.0MB"
},

{
    "name": "sam2_hiera_tiny.pt",
    "type": "sam2",
    "base": "SAM",
    "save_path": "sams",
    "description": "Segment Anything SAM 2 hiera model (tiny)",
    "reference": "https://github.com/facebookresearch/sam2#model-description",
    "filename": "sam2_hiera_tiny.pt",
    "url": "https://dl.fbaipublicfiles.com/segment_anything_2/072824/sam2_hiera_tiny.pt",
    "size": "149.0MB"
},
{
    "name": "sam2_hiera_small.pt",
    "type": "sam2",
    "base": "SAM",
    "save_path": "sams",
    "description": "Segment Anything SAM 2 hiera model (small)",
    "reference": "https://github.com/facebookresearch/sam2#model-description",
    "filename": "sam2_hiera_small.pt",
    "url": "https://dl.fbaipublicfiles.com/segment_anything_2/072824/sam2_hiera_small.pt",
    "size": "176.0MB"
},
{
    "name": "sam2_hiera_base_plus.pt",
    "type": "sam2",
    "base": "SAM",
    "save_path": "sams",
    "description": "Segment Anything SAM 2 hiera model (base+)",
    "reference": "https://github.com/facebookresearch/sam2#model-description",
    "filename": "sam2_hiera_base_plus.pt",
    "url": "https://dl.fbaipublicfiles.com/segment_anything_2/072824/sam2_hiera_base_plus.pt",
    "size": "309.0MB"
},
{
    "name": "sam2_hiera_large.pt",
    "type": "sam2",
    "base": "SAM",
    "save_path": "sams",
    "description": "Segment Anything SAM 2 hiera model (large)",
    "reference": "https://github.com/facebookresearch/sam2#model-description",
    "filename": "sam2_hiera_large.pt",
    "url": "https://dl.fbaipublicfiles.com/segment_anything_2/072824/sam2_hiera_large.pt",
    "size": "857.0MB"
},

{
    "name": "seecoder v1.0",
    "type": "seecoder",
@@ -4006,6 +4097,29 @@
    "size": "649MB"
},

{
    "name": "Comfy-Org/omnigen2_fp16.safetensors",
    "type": "diffusion_model",
    "base": "OmniGen2",
    "save_path": "default",
    "description": "OmniGen2 diffusion model. This is required for using OmniGen2.",
    "reference": "https://huggingface.co/Comfy-Org/Omnigen2_ComfyUI_repackaged",
    "filename": "omnigen2_fp16.safetensors",
    "url": "https://huggingface.co/Comfy-Org/Omnigen2_ComfyUI_repackaged/resolve/main/split_files/diffusion_models/omnigen2_fp16.safetensors",
    "size": "7.93GB"
},
{
    "name": "Comfy-Org/qwen_2.5_vl_fp16.safetensors",
    "type": "clip",
    "base": "qwen-2.5",
    "save_path": "default",
    "description": "text encoder for OmniGen2",
    "reference": "https://huggingface.co/Comfy-Org/Omnigen2_ComfyUI_repackaged",
    "filename": "qwen_2.5_vl_fp16.safetensors",
    "url": "https://huggingface.co/Comfy-Org/Omnigen2_ComfyUI_repackaged/resolve/main/split_files/text_encoders/qwen_2.5_vl_fp16.safetensors",
    "size": "7.51GB"
},

{
    "name": "FLUX.1 [Schnell] Diffusion model",
    "type": "diffusion_model",

@@ -4023,7 +4137,7 @@
    "type": "VAE",
    "base": "FLUX.1",
    "save_path": "vae/FLUX1",
    "description": "FLUX.1 VAE model",
    "description": "FLUX.1 VAE model\nNOTE: This VAE model can also be used for image generation with OmniGen2.",
    "reference": "https://huggingface.co/black-forest-labs/FLUX.1-schnell",
    "filename": "ae.safetensors",
    "url": "https://huggingface.co/black-forest-labs/FLUX.1-schnell/resolve/main/ae.safetensors",
@@ -4931,6 +5045,105 @@
    "size": "1.26GB"
},

{
    "name": "Comfy-Org/Wan2.2 i2v high noise 14B (fp16)",
    "type": "diffusion_model",
    "base": "Wan2.2",
    "save_path": "diffusion_models/Wan2.2",
    "description": "Wan2.2 diffusion model for i2v high noise 14B (fp16)",
    "reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
    "filename": "wan2.2_i2v_high_noise_14B_fp16.safetensors",
    "url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_high_noise_14B_fp16.safetensors",
    "size": "28.6GB"
},
{
    "name": "Comfy-Org/Wan2.2 i2v high noise 14B (fp8_scaled)",
    "type": "diffusion_model",
    "base": "Wan2.2",
    "save_path": "diffusion_models/Wan2.2",
    "description": "Wan2.2 diffusion model for i2v high noise 14B (fp8_scaled)",
    "reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
    "filename": "wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors",
    "url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors",
    "size": "14.3GB"
},
{
    "name": "Comfy-Org/Wan2.2 i2v low noise 14B (fp16)",
    "type": "diffusion_model",
    "base": "Wan2.2",
    "save_path": "diffusion_models/Wan2.2",
    "description": "Wan2.2 diffusion model for i2v low noise 14B (fp16)",
    "reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
    "filename": "wan2.2_i2v_low_noise_14B_fp16.safetensors",
    "url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_low_noise_14B_fp16.safetensors",
    "size": "28.6GB"
},
{
    "name": "Comfy-Org/Wan2.2 i2v low noise 14B (fp8_scaled)",
    "type": "diffusion_model",
    "base": "Wan2.2",
    "save_path": "diffusion_models/Wan2.2",
    "description": "Wan2.2 diffusion model for i2v low noise 14B (fp8_scaled)",
    "reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
    "filename": "wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors",
    "url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors",
    "size": "14.3GB"
},
{
    "name": "Comfy-Org/Wan2.2 t2v high noise 14B (fp16)",
    "type": "diffusion_model",
    "base": "Wan2.2",
    "save_path": "diffusion_models/Wan2.2",
    "description": "Wan2.2 diffusion model for t2v high noise 14B (fp16)",
    "reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
    "filename": "wan2.2_t2v_high_noise_14B_fp16.safetensors",
    "url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_t2v_high_noise_14B_fp16.safetensors",
    "size": "28.6GB"
},
{
    "name": "Comfy-Org/Wan2.2 t2v high noise 14B (fp8_scaled)",
    "type": "diffusion_model",
    "base": "Wan2.2",
    "save_path": "diffusion_models/Wan2.2",
    "description": "Wan2.2 diffusion model for t2v high noise 14B (fp8_scaled)",
    "reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
    "filename": "wan2.2_t2v_high_noise_14B_fp8_scaled.safetensors",
    "url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_t2v_high_noise_14B_fp8_scaled.safetensors",
    "size": "14.3GB"
},
{
    "name": "Comfy-Org/Wan2.2 t2v low noise 14B (fp16)",
    "type": "diffusion_model",
    "base": "Wan2.2",
    "save_path": "diffusion_models/Wan2.2",
    "description": "Wan2.2 diffusion model for t2v low noise 14B (fp16)",
    "reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
    "filename": "wan2.2_t2v_low_noise_14B_fp16.safetensors",
    "url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_t2v_low_noise_14B_fp16.safetensors",
    "size": "28.6GB"
},
{
    "name": "Comfy-Org/Wan2.2 t2v low noise 14B (fp8_scaled)",
    "type": "diffusion_model",
    "base": "Wan2.2",
    "save_path": "diffusion_models/Wan2.2",
    "description": "Wan2.2 diffusion model for t2v low noise 14B (fp8_scaled)",
    "reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
    "filename": "wan2.2_t2v_low_noise_14B_fp8_scaled.safetensors",
    "url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_t2v_low_noise_14B_fp8_scaled.safetensors",
    "size": "14.3GB"
},
{
    "name": "Comfy-Org/Wan2.2 ti2v 5B (fp16)",
    "type": "diffusion_model",
    "base": "Wan2.2",
    "save_path": "diffusion_models/Wan2.2",
    "description": "Wan2.2 diffusion model for ti2v 5B (fp16)",
    "reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
    "filename": "wan2.2_ti2v_5B_fp16.safetensors",
    "url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_ti2v_5B_fp16.safetensors",
    "size": "10.0GB"
},

{
    "name": "Comfy-Org/umt5_xxl_fp16.safetensors",

@@ -4953,6 +5166,195 @@
    "filename": "umt5_xxl_fp8_e4m3fn_scaled.safetensors",
    "url": "https://huggingface.co/Comfy-Org/Wan_2.1_ComfyUI_repackaged/resolve/main/split_files/text_encoders/umt5_xxl_fp8_e4m3fn_scaled.safetensors",
    "size": "6.74GB"
},

{
    "name": "lllyasviel/FramePackI2V_HY",
    "type": "FramePackI2V",
    "base": "FramePackI2V",
    "save_path": "diffusers/lllyasviel",
    "description": "[SNAPSHOT] This is the f1k1_x_g9_f1k1f2k2f16k4_td FramePack for HY. [w/You cannot download this item on ComfyUI-Manager versions below V3.18]",
    "reference": "https://huggingface.co/lllyasviel/FramePackI2V_HY",
    "filename": "<huggingface>",
    "url": "lllyasviel/FramePackI2V_HY",
    "size": "25.75GB"
},

{
    "name": "LTX-Video Spatial Upscaler v0.9.7",
    "type": "upscale",
    "base": "upscale",
    "save_path": "default",
    "description": "Spatial upscaler model for LTX-Video. This model enhances the spatial resolution of generated videos.",
    "reference": "https://huggingface.co/Lightricks/LTX-Video",
    "filename": "ltxv-spatial-upscaler-0.9.7.safetensors",
    "url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-spatial-upscaler-0.9.7.safetensors",
    "size": "505MB"
},
{
    "name": "LTX-Video Temporal Upscaler v0.9.7",
    "type": "upscale",
    "base": "upscale",
    "save_path": "default",
    "description": "Temporal upscaler model for LTX-Video. This model enhances the temporal resolution and smoothness of generated videos.",
    "reference": "https://huggingface.co/Lightricks/LTX-Video",
    "filename": "ltxv-temporal-upscaler-0.9.7.safetensors",
    "url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-temporal-upscaler-0.9.7.safetensors",
    "size": "524MB"
},
{
    "name": "LTX-Video 13B v0.9.7",
    "type": "checkpoint",
    "base": "LTX-Video",
    "save_path": "checkpoints/LTXV",
    "description": "High-resolution quality LTX-Video 13B model.",
    "reference": "https://huggingface.co/Lightricks/LTX-Video",
    "filename": "ltxv-13b-0.9.7-dev.safetensors",
    "url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-13b-0.9.7-dev.safetensors",
    "size": "28.6GB"
},
{
    "name": "LTX-Video 13B FP8 v0.9.7",
    "type": "checkpoint",
    "base": "LTX-Video",
    "save_path": "checkpoints/LTXV",
    "description": "Quantized version of the LTX-Video 13B model, optimized for lower VRAM usage while maintaining high quality.",
    "reference": "https://huggingface.co/Lightricks/LTX-Video",
    "filename": "ltxv-13b-0.9.7-dev-fp8.safetensors",
    "url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-13b-0.9.7-dev-fp8.safetensors",
    "size": "15.7GB"
},
{
    "name": "LTX-Video 13B Distilled v0.9.7",
    "type": "checkpoint",
    "base": "LTX-Video",
    "save_path": "checkpoints/LTXV",
    "description": "Distilled version of the LTX-Video 13B model, providing improved efficiency while maintaining high-resolution quality.",
    "reference": "https://huggingface.co/Lightricks/LTX-Video",
    "filename": "ltxv-13b-0.9.7-distilled.safetensors",
    "url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-13b-0.9.7-distilled.safetensors",
    "size": "28.6GB"
},
{
    "name": "LTX-Video 13B Distilled FP8 v0.9.7",
    "type": "checkpoint",
    "base": "LTX-Video",
    "save_path": "checkpoints/LTXV",
    "description": "Quantized distilled version of the LTX-Video 13B model, optimized for even lower VRAM usage while maintaining quality.",
    "reference": "https://huggingface.co/Lightricks/LTX-Video",
    "filename": "ltxv-13b-0.9.7-distilled-fp8.safetensors",
    "url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-13b-0.9.7-distilled-fp8.safetensors",
    "size": "15.7GB"
},
{
    "name": "LTX-Video 2B Distilled v0.9.8",
    "type": "checkpoint",
    "base": "LTX-Video",
    "save_path": "checkpoints/LTXV",
    "description": "LTX-Video 2B distilled model v0.9.8 with improved prompt understanding and detail generation.",
    "reference": "https://huggingface.co/Lightricks/LTX-Video",
    "filename": "ltxv-2b-0.9.8-distilled.safetensors",
    "url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-2b-0.9.8-distilled.safetensors",
    "size": "6.34GB"
},
{
    "name": "LTX-Video 2B Distilled FP8 v0.9.8",
    "type": "checkpoint",
    "base": "LTX-Video",
    "save_path": "checkpoints/LTXV",
    "description": "Quantized LTX-Video 2B distilled model v0.9.8 with improved prompt understanding and detail generation, optimized for lower VRAM usage.",
    "reference": "https://huggingface.co/Lightricks/LTX-Video",
    "filename": "ltxv-2b-0.9.8-distilled-fp8.safetensors",
    "url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-2b-0.9.8-distilled-fp8.safetensors",
    "size": "4.46GB"
},
{
    "name": "LTX-Video 13B Distilled v0.9.8",
    "type": "checkpoint",
    "base": "LTX-Video",
    "save_path": "checkpoints/LTXV",
    "description": "LTX-Video 13B distilled model v0.9.8 with improved prompt understanding and detail generation.",
    "reference": "https://huggingface.co/Lightricks/LTX-Video",
    "filename": "ltxv-13b-0.9.8-distilled.safetensors",
    "url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-13b-0.9.8-distilled.safetensors",
    "size": "28.6GB"
},
{
    "name": "LTX-Video 13B Distilled FP8 v0.9.8",
    "type": "checkpoint",
    "base": "LTX-Video",
    "save_path": "checkpoints/LTXV",
    "description": "Quantized LTX-Video 13B distilled model v0.9.8 with improved prompt understanding and detail generation, optimized for lower VRAM usage.",
    "reference": "https://huggingface.co/Lightricks/LTX-Video",
    "filename": "ltxv-13b-0.9.8-distilled-fp8.safetensors",
    "url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-13b-0.9.8-distilled-fp8.safetensors",
    "size": "15.7GB"
},
{
    "name": "LTX-Video 13B Distilled LoRA v0.9.7",
    "type": "lora",
    "base": "LTX-Video",
    "save_path": "loras",
    "description": "A LoRA adapter that transforms the standard LTX-Video 13B model into a distilled version when loaded.",
    "reference": "https://huggingface.co/Lightricks/LTX-Video",
    "filename": "ltxv-13b-0.9.7-distilled-lora128.safetensors",
    "url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-13b-0.9.7-distilled-lora128.safetensors",
    "size": "1.33GB"
},
{
    "name": "LTX-Video ICLoRA Depth 13B v0.9.7",
    "type": "lora",
    "base": "LTX-Video",
    "save_path": "loras",
    "description": "In-Context LoRA (IC LoRA) for depth-controlled video-to-video generation with precise depth conditioning.",
    "reference": "https://huggingface.co/Lightricks/LTX-Video-ICLoRA-depth-13b-0.9.7",
    "filename": "ltxv-097-ic-lora-depth-control-comfyui.safetensors"
|
||||
"url": "https://huggingface.co/Lightricks/LTX-Video-ICLoRA-depth-13b-0.9.7/resolve/main/ltxv-097-ic-lora-depth-control-comfyui.safetensors",
|
||||
"size": "81.9MB"
|
||||
},
|
||||
{
|
||||
"name": "LTX-Video ICLoRA Pose 13B v0.9.7",
|
||||
"type": "lora",
|
||||
"base": "LTX-Video",
|
||||
"save_path": "loras",
|
||||
"description": "In-Context LoRA (IC LoRA) for pose-controlled video-to-video generation with precise pose conditioning.",
|
||||
"reference": "https://huggingface.co/Lightricks/LTX-Video-ICLoRA-pose-13b-0.9.7",
|
||||
"filename": "ltxv-097-ic-lora-pose-control-comfyui.safetensors",
|
||||
"url": "https://huggingface.co/Lightricks/LTX-Video-ICLoRA-pose-13b-0.9.7/resolve/main/ltxv-097-ic-lora-pose-control-comfyui.safetensors",
|
||||
"size": "151MB"
|
||||
},
|
||||
{
|
||||
"name": "LTX-Video ICLoRA Canny 13B v0.9.7",
|
||||
"type": "lora",
|
||||
"base": "LTX-Video",
|
||||
"save_path": "loras",
|
||||
"description": "In-Context LoRA (IC LoRA) for canny edge-controlled video-to-video generation with precise edge conditioning.",
|
||||
"reference": "https://huggingface.co/Lightricks/LTX-Video-ICLoRA-canny-13b-0.9.7",
|
||||
"filename": "ltxv-097-ic-lora-canny-control-comfyui.safetensors",
|
||||
"url": "https://huggingface.co/Lightricks/LTX-Video-ICLoRA-canny-13b-0.9.7/resolve/main/ltxv-097-ic-lora-canny-control-comfyui.safetensors",
|
||||
"size": "81.9MB"
|
||||
},
|
||||
{
|
||||
"name": "LTX-Video ICLoRA Detailer 13B v0.9.8",
|
||||
"type": "lora",
|
||||
"base": "LTX-Video",
|
||||
"save_path": "loras",
|
||||
"description": "A video detailer model on top of LTXV_13B_098_DEV trained on custom data using In-Context LoRA (IC LoRA) method.",
|
||||
"reference": "https://huggingface.co/Lightricks/LTX-Video-ICLoRA-detailer-13b-0.9.8",
|
||||
"filename": "ltxv-098-ic-lora-detailer-comfyui.safetensors",
|
||||
"url": "https://huggingface.co/Lightricks/LTX-Video-ICLoRA-detailer-13b-0.9.8/resolve/main/ltxv-098-ic-lora-detailer-comfyui.safetensors",
|
||||
"size": "1.31GB"
|
||||
},
|
||||
{
|
||||
"name": "Latent Bridge Matching for Image Relighting",
|
||||
"type": "diffusion_model",
|
||||
"base": "LBM",
|
||||
"save_path": "diffusion_models/LBM",
|
||||
"description": "Latent Bridge Matching (LBM) Relighting model",
|
||||
"reference": "https://huggingface.co/jasperai/LBM_relighting",
|
||||
"filename": "LBM_relighting.safetensors",
|
||||
"url": "https://huggingface.co/jasperai/LBM_relighting/resolve/main/model.safetensors",
|
||||
"size": "5.02GB"
|
||||
}
|
||||
]
|
||||
}
|
||||
@@ -12,13 +12,10 @@ import ast
import logging
import traceback

glob_path = os.path.join(os.path.dirname(__file__), "glob")
sys.path.append(glob_path)

import security_check
import manager_util
import cm_global
import manager_downloader
from .common import security_check
from .common import manager_util
from .common import cm_global
from .common import manager_downloader
import folder_paths

manager_util.add_python_path_to_env()
@@ -38,7 +35,6 @@ else:
def current_timestamp():
    return str(time.time()).split('.')[0]

security_check.security_check()

cm_global.pip_blacklist = {'torch', 'torchaudio', 'torchsde', 'torchvision'}
cm_global.pip_downgrade_blacklist = ['torch', 'torchaudio', 'torchsde', 'torchvision', 'transformers', 'safetensors', 'kornia']
@@ -67,16 +63,14 @@ def is_import_failed_extension(name):
comfy_path = os.environ.get('COMFYUI_PATH')
comfy_base_path = os.environ.get('COMFYUI_FOLDERS_BASE_PATH')

if comfy_path is None:
    # legacy env var
    comfy_path = os.environ.get('COMFYUI_PATH')

if comfy_path is None:
    comfy_path = os.path.abspath(os.path.dirname(sys.modules['__main__'].__file__))
    os.environ['COMFYUI_PATH'] = comfy_path

if comfy_base_path is None:
    comfy_base_path = comfy_path


sys.__comfyui_manager_register_message_collapse = register_message_collapse
sys.__comfyui_manager_is_import_failed_extension = is_import_failed_extension
cm_global.register_api('cm.register_message_collapse', register_message_collapse)
@@ -92,9 +86,6 @@ manager_pip_blacklist_path = os.path.join(manager_files_path, "pip_blacklist.lis
restore_snapshot_path = os.path.join(manager_files_path, "startup-scripts", "restore-snapshot.json")
manager_config_path = os.path.join(manager_files_path, 'config.ini')

cm_cli_path = os.path.join(comfyui_manager_path, "cm-cli.py")


default_conf = {}

def read_config():
@@ -119,13 +110,14 @@ def check_file_logging():

read_config()
read_uv_mode()
security_check.security_check()
check_file_logging()

cm_global.pip_overrides = {'numpy': 'numpy<2'}
cm_global.pip_overrides = {}

if os.path.exists(manager_pip_overrides_path):
    with open(manager_pip_overrides_path, 'r', encoding="UTF-8", errors="ignore") as json_file:
        cm_global.pip_overrides = json.load(json_file)
        cm_global.pip_overrides['numpy'] = 'numpy<2'


if os.path.exists(manager_pip_blacklist_path):
@@ -338,7 +330,12 @@ try:
log_file.write(message)
else:
    log_file.write(f"[{timestamp}] {message}")
log_file.flush()

try:
    log_file.flush()
except Exception:
    pass

self.last_char = message if message == '' else message[-1]

if not file_only:
@@ -351,7 +348,10 @@ try:
original_stderr.flush()

def flush(self):
    log_file.flush()
    try:
        log_file.flush()
    except Exception:
        pass

    with std_log_lock:
        if self.is_stdout:
@@ -392,7 +392,11 @@ try:
def emit(self, record):
    global is_start_mode

    message = record.getMessage()
    try:
        message = record.getMessage()
    except Exception as e:
        message = f"<<logging error>>: {record} - {e}"
        original_stderr.write(message)

    if is_start_mode:
        match = re.search(pat_import_fail, message)
@@ -435,35 +439,6 @@ except Exception as e:
print(f"[ComfyUI-Manager] Logging failed: {e}")


def ensure_dependencies():
    try:
        import git  # noqa: F401
        import toml  # noqa: F401
        import rich  # noqa: F401
        import chardet  # noqa: F401
    except ModuleNotFoundError:
        my_path = os.path.dirname(__file__)
        requirements_path = os.path.join(my_path, "requirements.txt")

        print("## ComfyUI-Manager: installing dependencies. (GitPython)")
        try:
            subprocess.check_output(manager_util.make_pip_cmd(['install', '-r', requirements_path]))
        except subprocess.CalledProcessError:
            print("## [ERROR] ComfyUI-Manager: Attempting to reinstall dependencies using an alternative method.")
            try:
                subprocess.check_output(manager_util.make_pip_cmd(['install', '--user', '-r', requirements_path]))
            except subprocess.CalledProcessError:
                print("## [ERROR] ComfyUI-Manager: Failed to install the GitPython package in the correct Python environment. Please install it manually in the appropriate environment. (You can seek help at https://app.element.io/#/room/%23comfyui_space%3Amatrix.org)")

        try:
            import git  # noqa: F401
            print("## ComfyUI-Manager: installing dependencies done.")
        except:
            # maybe we should sys.exit() here? there are at least two screens' worth of error messages still being pumped after our error messages
            print("## [ERROR] ComfyUI-Manager: GitPython package seems to be installed, but failed to load somehow. Make sure you have a working git client installed")

ensure_dependencies()


print("** ComfyUI startup time:", current_timestamp())
print("** Platform:", platform.system())
print("** Python version:", sys.version)
@@ -487,7 +462,7 @@ def read_downgrade_blacklist():
items = [x.strip() for x in items if x != '']
cm_global.pip_downgrade_blacklist += items
cm_global.pip_downgrade_blacklist = list(set(cm_global.pip_downgrade_blacklist))
except:
except Exception:
    pass


@@ -593,7 +568,10 @@ if os.path.exists(restore_snapshot_path):
if 'COMFYUI_FOLDERS_BASE_PATH' not in new_env:
    new_env["COMFYUI_FOLDERS_BASE_PATH"] = comfy_path

cmd_str = [sys.executable, cm_cli_path, 'restore-snapshot', restore_snapshot_path]
if 'COMFYUI_PATH' not in new_env:
    new_env['COMFYUI_PATH'] = os.path.dirname(folder_paths.__file__)

cmd_str = [sys.executable, '-m', 'comfyui_manager.cm_cli', 'restore-snapshot', restore_snapshot_path]
exit_code = process_wrap(cmd_str, custom_nodes_base_path, handler=msg_capture, env=new_env)

if exit_code != 0:
@@ -620,6 +598,7 @@ def execute_lazy_install_script(repo_path, executable):
lines = manager_util.robust_readlines(requirements_path)
for line in lines:
    package_name = remap_pip_package(line.strip())
    package_name = package_name.split('#')[0].strip()
    if package_name and not is_installed(package_name):
        if '--index-url' in package_name:
            s = package_name.split('--index-url')
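For reference, the pip override table touched above reduces to a per-package dictionary lookup when requirement lines are remapped later in the file; a minimal standalone sketch of that assumed behavior (the internals of `remap_pip_package` are not shown in this diff):

```python
# Old default shown above; the new code starts from an empty table.
pip_overrides = {"numpy": "numpy<2"}

def remap_pip_package(pkg: str) -> str:
    # Assumed behavior: replace a requirement when its name has an override.
    return pip_overrides.get(pkg, pkg)

assert remap_pip_package("numpy") == "numpy<2"  # overridden
assert remap_pip_package("rich") == "rich"      # passed through unchanged
```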
docs/README.md (new file, 41 lines)
@@ -0,0 +1,41 @@
# ComfyUI-Manager: Documentation

This directory contains documentation for ComfyUI-Manager, providing guides and tutorials for users in multiple languages.

## Directory Structure

The documentation is organized into language-specific directories:

- **en/**: English documentation
- **ko/**: Korean documentation

## Core Documentation Files

### Command-Line Interface

- **cm-cli.md**: Documentation for the ComfyUI-Manager Command Line Interface (CLI), which allows using manager functionality without the UI.

### Advanced Features

- **use_aria2.md**: Guide for using the aria2 download accelerator with ComfyUI-Manager for faster model downloads.

## Documentation Standards

The documentation follows these standards:

1. **Markdown Format**: All documentation is written in Markdown for easy rendering on GitHub and other platforms
2. **Language-specific Directories**: Content is separated by language to facilitate localization
3. **Feature-focused Documentation**: Each major feature has its own documentation file
4. **Updated with Releases**: Documentation is kept in sync with software releases

## Contributing to Documentation

When contributing new documentation:

1. Place files in the appropriate language directory
2. Use clear, concise language appropriate for the target audience
3. Include examples where helpful
4. Consider adding screenshots or diagrams for complex features
5. Maintain consistent formatting with existing documentation

This documentation directory will continue to grow to support the expanding feature set of ComfyUI-Manager.
node_db/README.md (new file, 95 lines)
@@ -0,0 +1,95 @@
# ComfyUI-Manager: Node Database (node_db)

This directory contains the JSON database files that power ComfyUI-Manager's legacy node registry system. While the manager is gradually transitioning to the online Custom Node Registry (CNR), these local JSON files continue to provide important metadata about custom nodes, models, and their integrations.

## Directory Structure

The node_db directory is organized into several subdirectories, each serving a specific purpose:

- **dev/**: Development channel files with the latest additions and experimental nodes
- **legacy/**: Historical/legacy nodes that may require special handling
- **new/**: New nodes that have passed initial verification but are still being evaluated
- **forked/**: Forks of existing nodes with modifications
- **tutorial/**: Example and tutorial nodes designed for learning purposes

## Core Database Files

Each subdirectory contains a standard set of JSON files:

- **custom-node-list.json**: Primary database of custom nodes with metadata
- **extension-node-map.json**: Maps each extension to the individual nodes it provides
- **model-list.json**: Catalog of models that can be downloaded through the manager
- **alter-list.json**: Alternative implementations of nodes for compatibility or alternative functionality
- **github-stats.json**: GitHub repository statistics used for node popularity metrics
## Database Schema

### custom-node-list.json
```json
{
  "custom_nodes": [
    {
      "title": "Node display name",
      "name": "Repository name",
      "reference": "Original repository if forked",
      "files": ["GitHub URL or other source location"],
      "install_type": "git-clone",
      "description": "Description of the node's functionality",
      "pip": ["optional pip dependencies"],
      "js": ["optional JavaScript files"],
      "tags": ["categorization tags"]
    }
  ]
}
```
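For illustration, a minimal sketch of loading this file and filtering entries (the channel path below is an assumption; any of the subdirectories listed above works the same way):

```python
import json

# Hypothetical channel path: point this at the subdirectory you are inspecting.
with open("node_db/new/custom-node-list.json", encoding="utf-8") as f:
    db = json.load(f)

# Collect the source URL of every git-clone style node.
git_nodes = [
    node["files"][0]
    for node in db["custom_nodes"]
    if node.get("install_type") == "git-clone" and node.get("files")
]
print(f"{len(git_nodes)} git-clone nodes, e.g. {git_nodes[0]}")
```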
### extension-node-map.json
```json
{
  "extension-id": [
    ["list", "of", "node", "classes"],
    {
      "author": "Author name",
      "description": "Extension description",
      "nodename_pattern": "Optional regex pattern for node name matching"
    }
  ]
}
```
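The map also supports the reverse lookup — given a node class name, find the extension that provides it. A sketch under the schema above (the file path is an assumption; the regex fallback uses the optional `nodename_pattern` field):

```python
import json
import re

def find_extension(node_class, map_path="node_db/new/extension-node-map.json"):
    """Return the id of the extension that provides `node_class`, or None."""
    with open(map_path, encoding="utf-8") as f:
        node_map = json.load(f)

    for ext_id, (node_classes, meta) in node_map.items():
        if node_class in node_classes:
            return ext_id
        # Fall back to the optional regex pattern when one is declared.
        pattern = meta.get("nodename_pattern")
        if pattern and re.search(pattern, node_class):
            return ext_id
    return None
```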
## Transition to Custom Node Registry (CNR)

This local database system is being progressively replaced by the online Custom Node Registry (CNR), which provides:
- Real-time updates without manual JSON maintenance
- Improved versioning support
- Better security validation
- Enhanced metadata

The Manager supports both systems simultaneously during the transition period.

## Implementation Details

- The database follows a channel-based architecture for different sources
- Multiple database modes are supported: Channel, Local, and Remote
- The system supports differential updates to minimize bandwidth usage
- Security levels are enforced for node installations based on their source
## Usage in the Application

The Manager's backend uses these database files to (a sketch of the model-list lookup follows the list):

1. Provide browsable lists of available nodes and models
2. Resolve dependencies for installation
3. Track updates and new versions
4. Map node classes to their source repositories
5. Assess risk levels for installation security
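As a concrete illustration of item 1 for models, each model-list.json entry pairs a download `url` with a `save_path` and `filename`; a minimal sketch that turns the catalog into a download plan (the `models_root` argument and the handling shown here are assumptions; the real downloader also resolves the special `default` save_path and `<huggingface>` snapshot entries):

```python
import json
from pathlib import Path

def planned_downloads(model_list_path, models_root):
    """Pair each model URL with the local file it would be saved to."""
    with open(model_list_path, encoding="utf-8") as f:
        models = json.load(f)["models"]

    plan = []
    for m in models:
        # "<huggingface>" marks a repo snapshot rather than a single file; skip here.
        if m["filename"] == "<huggingface>":
            continue
        plan.append((m["url"], Path(models_root) / m["save_path"] / m["filename"]))
    return plan
```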
## Maintenance Scripts

Each subdirectory contains a `scan.sh` script that assists with:
- Scanning repositories for new nodes
- Updating metadata
- Validating database integrity
- Generating proper JSON structures

This database system enables flexible, secure, and comprehensive management of the ComfyUI ecosystem while the transition to CNR continues.
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -1,3 +1,3 @@
#!/bin/bash
rm ~/.tmp/dev/*.py > /dev/null 2>&1
python ../../scanner.py ~/.tmp/dev
python ../../scanner.py ~/.tmp/dev $*
@@ -1,5 +1,25 @@
{
"custom_nodes": [
{
"author": "synchronicity-labs",
"title": "ComfyUI Sync Lipsync Node",
"reference": "https://github.com/synchronicity-labs/sync-comfyui",
"files": [
"https://github.com/synchronicity-labs/sync-comfyui"
],
"install_type": "git-clone",
"description": "This custom node allows you to perform audio-video lip synchronization inside ComfyUI using a simple interface."
},
{
"author": "joaomede",
"title": "ComfyUI-Unload-Model-Fork",
"reference": "https://github.com/joaomede/ComfyUI-Unload-Model-Fork",
"files": [
"https://github.com/joaomede/ComfyUI-Unload-Model-Fork"
],
"install_type": "git-clone",
"description": "For unloading a model or all models, using the memory management that is already present in ComfyUI. Copied from [a/https://github.com/willblaschko/ComfyUI-Unload-Models](https://github.com/willblaschko/ComfyUI-Unload-Models) but without the unnecessary extra stuff."
},
{
"author": "SanDiegoDude",
"title": "ComfyUI-HiDream-Sampler [WIP]",
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -1,5 +1,320 @@
{
"models": [

{
"name": "Comfy-Org/Wan2.2 i2v high noise 14B (fp16)",
"type": "diffusion_model",
"base": "Wan2.2",
"save_path": "diffusion_models/Wan2.2",
"description": "Wan2.2 diffusion model for i2v high noise 14B (fp16)",
"reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
"filename": "wan2.2_i2v_high_noise_14B_fp16.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_high_noise_14B_fp16.safetensors",
"size": "28.6GB"
},
{
"name": "Comfy-Org/Wan2.2 i2v high noise 14B (fp8_scaled)",
"type": "diffusion_model",
"base": "Wan2.2",
"save_path": "diffusion_models/Wan2.2",
"description": "Wan2.2 diffusion model for i2v high noise 14B (fp8_scaled)",
"reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
"filename": "wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors",
"size": "14.3GB"
},
{
"name": "Comfy-Org/Wan2.2 i2v low noise 14B (fp16)",
"type": "diffusion_model",
"base": "Wan2.2",
"save_path": "diffusion_models/Wan2.2",
"description": "Wan2.2 diffusion model for i2v low noise 14B (fp16)",
"reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
"filename": "wan2.2_i2v_low_noise_14B_fp16.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_low_noise_14B_fp16.safetensors",
"size": "28.6GB"
},
{
"name": "Comfy-Org/Wan2.2 i2v low noise 14B (fp8_scaled)",
"type": "diffusion_model",
"base": "Wan2.2",
"save_path": "diffusion_models/Wan2.2",
"description": "Wan2.2 diffusion model for i2v low noise 14B (fp8_scaled)",
"reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
"filename": "wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors",
"size": "14.3GB"
},
{
"name": "Comfy-Org/Wan2.2 t2v high noise 14B (fp16)",
"type": "diffusion_model",
"base": "Wan2.2",
"save_path": "diffusion_models/Wan2.2",
"description": "Wan2.2 diffusion model for t2v high noise 14B (fp16)",
"reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
"filename": "wan2.2_t2v_high_noise_14B_fp16.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_t2v_high_noise_14B_fp16.safetensors",
"size": "28.6GB"
},
{
"name": "Comfy-Org/Wan2.2 t2v high noise 14B (fp8_scaled)",
"type": "diffusion_model",
"base": "Wan2.2",
"save_path": "diffusion_models/Wan2.2",
"description": "Wan2.2 diffusion model for t2v high noise 14B (fp8_scaled)",
"reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
"filename": "wan2.2_t2v_high_noise_14B_fp8_scaled.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_t2v_high_noise_14B_fp8_scaled.safetensors",
"size": "14.3GB"
},
{
"name": "Comfy-Org/Wan2.2 t2v low noise 14B (fp16)",
"type": "diffusion_model",
"base": "Wan2.2",
"save_path": "diffusion_models/Wan2.2",
"description": "Wan2.2 diffusion model for t2v low noise 14B (fp16)",
"reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
"filename": "wan2.2_t2v_low_noise_14B_fp16.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_t2v_low_noise_14B_fp16.safetensors",
"size": "28.6GB"
},
{
"name": "Comfy-Org/Wan2.2 t2v low noise 14B (fp8_scaled)",
"type": "diffusion_model",
"base": "Wan2.2",
"save_path": "diffusion_models/Wan2.2",
"description": "Wan2.2 diffusion model for t2v low noise 14B (fp8_scaled)",
"reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
"filename": "wan2.2_t2v_low_noise_14B_fp8_scaled.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_t2v_low_noise_14B_fp8_scaled.safetensors",
"size": "14.3GB"
},
{
"name": "Comfy-Org/Wan2.2 ti2v 5B (fp16)",
"type": "diffusion_model",
"base": "Wan2.2",
"save_path": "diffusion_models/Wan2.2",
"description": "Wan2.2 diffusion model for ti2v 5B (fp16)",
"reference": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged",
"filename": "wan2.2_ti2v_5B_fp16.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_ti2v_5B_fp16.safetensors",
"size": "10.0GB"
},

{
"name": "sam2.1_hiera_tiny.pt",
"type": "sam2.1",
"base": "SAM",
"save_path": "sams",
"description": "Segment Anything SAM 2.1 hiera model (tiny)",
"reference": "https://github.com/facebookresearch/sam2#model-description",
"filename": "sam2.1_hiera_tiny.pt",
"url": "https://dl.fbaipublicfiles.com/segment_anything_2/092824/sam2.1_hiera_tiny.pt",
"size": "149.0MB"
},
{
"name": "sam2.1_hiera_small.pt",
"type": "sam2.1",
"base": "SAM",
"save_path": "sams",
"description": "Segment Anything SAM 2.1 hiera model (small)",
"reference": "https://github.com/facebookresearch/sam2#model-description",
"filename": "sam2.1_hiera_small.pt",
"url": "https://dl.fbaipublicfiles.com/segment_anything_2/092824/sam2.1_hiera_small.pt",
"size": "176.0MB"
},
{
"name": "sam2.1_hiera_base_plus.pt",
"type": "sam2.1",
"base": "SAM",
"save_path": "sams",
"description": "Segment Anything SAM 2.1 hiera model (base+)",
"reference": "https://github.com/facebookresearch/sam2#model-description",
"filename": "sam2.1_hiera_base_plus.pt",
"url": "https://dl.fbaipublicfiles.com/segment_anything_2/092824/sam2.1_hiera_base_plus.pt",
"size": "309.0MB"
},
{
"name": "sam2.1_hiera_large.pt",
"type": "sam2.1",
"base": "SAM",
"save_path": "sams",
"description": "Segment Anything SAM 2.1 hiera model (large)",
"reference": "https://github.com/facebookresearch/sam2#model-description",
"filename": "sam2.1_hiera_large.pt",
"url": "https://dl.fbaipublicfiles.com/segment_anything_2/092824/sam2.1_hiera_large.pt",
"size": "857.0MB"
},

{
"name": "sam2_hiera_tiny.pt",
"type": "sam2",
"base": "SAM",
"save_path": "sams",
"description": "Segment Anything SAM 2 hiera model (tiny)",
"reference": "https://github.com/facebookresearch/sam2#model-description",
"filename": "sam2_hiera_tiny.pt",
"url": "https://dl.fbaipublicfiles.com/segment_anything_2/072824/sam2_hiera_tiny.pt",
"size": "149.0MB"
},
{
"name": "sam2_hiera_small.pt",
"type": "sam2",
"base": "SAM",
"save_path": "sams",
"description": "Segment Anything SAM 2 hiera model (small)",
"reference": "https://github.com/facebookresearch/sam2#model-description",
"filename": "sam2_hiera_small.pt",
"url": "https://dl.fbaipublicfiles.com/segment_anything_2/072824/sam2_hiera_small.pt",
"size": "176.0MB"
},
{
"name": "sam2_hiera_base_plus.pt",
"type": "sam2",
"base": "SAM",
"save_path": "sams",
"description": "Segment Anything SAM 2 hiera model (base+)",
"reference": "https://github.com/facebookresearch/sam2#model-description",
"filename": "sam2_hiera_base_plus.pt",
"url": "https://dl.fbaipublicfiles.com/segment_anything_2/072824/sam2_hiera_base_plus.pt",
"size": "309.0MB"
},
{
"name": "sam2_hiera_large.pt",
"type": "sam2",
"base": "SAM",
"save_path": "sams",
"description": "Segment Anything SAM 2 hiera model (large)",
"reference": "https://github.com/facebookresearch/sam2#model-description",
"filename": "sam2_hiera_large.pt",
"url": "https://dl.fbaipublicfiles.com/segment_anything_2/072824/sam2_hiera_large.pt",
"size": "857.0MB"
},

{
"name": "Comfy-Org/omnigen2_fp16.safetensors",
"type": "diffusion_model",
"base": "OmniGen2",
"save_path": "default",
"description": "OmniGen2 diffusion model. This is required for using OmniGen2.",
"reference": "https://huggingface.co/Comfy-Org/Omnigen2_ComfyUI_repackaged",
"filename": "omnigen2_fp16.safetensors",
"url": "https://huggingface.co/Comfy-Org/Omnigen2_ComfyUI_repackaged/resolve/main/split_files/diffusion_models/omnigen2_fp16.safetensors",
"size": "7.93GB"
},
{
"name": "Comfy-Org/qwen_2.5_vl_fp16.safetensors",
"type": "clip",
"base": "qwen-2.5",
"save_path": "default",
"description": "text encoder for OmniGen2",
"reference": "https://huggingface.co/Comfy-Org/Omnigen2_ComfyUI_repackaged",
"filename": "qwen_2.5_vl_fp16.safetensors",
"url": "https://huggingface.co/Comfy-Org/Omnigen2_ComfyUI_repackaged/resolve/main/split_files/text_encoders/qwen_2.5_vl_fp16.safetensors",
"size": "7.51GB"
},

{
"name": "Latent Bridge Matching for Image Relighting",
"type": "diffusion_model",
"base": "LBM",
"save_path": "diffusion_models/LBM",
"description": "Latent Bridge Matching (LBM) Relighting model",
"reference": "https://huggingface.co/jasperai/LBM_relighting",
"filename": "LBM_relighting.safetensors",
"url": "https://huggingface.co/jasperai/LBM_relighting/resolve/main/model.safetensors",
"size": "5.02GB"
},

{
"name": "LTX-Video 13B Distilled v0.9.7",
"type": "checkpoint",
"base": "LTX-Video",
"save_path": "checkpoints/LTXV",
"description": "Distilled version of the LTX-Video 13B model, providing improved efficiency while maintaining high-resolution quality.",
"reference": "https://huggingface.co/Lightricks/LTX-Video",
"filename": "ltxv-13b-0.9.7-distilled.safetensors",
"url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-13b-0.9.7-distilled.safetensors",
"size": "28.6GB"
},
{
"name": "LTX-Video 13B Distilled FP8 v0.9.7",
"type": "checkpoint",
"base": "LTX-Video",
"save_path": "checkpoints/LTXV",
"description": "Quantized distilled version of the LTX-Video 13B model, optimized for even lower VRAM usage while maintaining quality.",
"reference": "https://huggingface.co/Lightricks/LTX-Video",
"filename": "ltxv-13b-0.9.7-distilled-fp8.safetensors",
"url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-13b-0.9.7-distilled-fp8.safetensors",
"size": "15.7GB"
},
{
"name": "LTX-Video 13B Distilled LoRA v0.9.7",
"type": "lora",
"base": "LTX-Video",
"save_path": "loras",
"description": "A LoRA adapter that transforms the standard LTX-Video 13B model into a distilled version when loaded.",
"reference": "https://huggingface.co/Lightricks/LTX-Video",
"filename": "ltxv-13b-0.9.7-distilled-lora128.safetensors",
"url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-13b-0.9.7-distilled-lora128.safetensors",
"size": "1.33GB"
},
{
"name": "lllyasviel/FramePackI2V_HY",
"type": "FramePackI2V",
"base": "FramePackI2V",
"save_path": "diffusers/lllyasviel",
"description": "[SNAPSHOT] This is the f1k1_x_g9_f1k1f2k2f16k4_td FramePack for HY. [w/You cannot download this item on ComfyUI-Manager versions below V3.18]",
"reference": "https://huggingface.co/lllyasviel/FramePackI2V_HY",
"filename": "<huggingface>",
"url": "lllyasviel/FramePackI2V_HY",
"size": "25.75GB"
},

{
"name": "LTX-Video Spatial Upscaler v0.9.7",
"type": "checkpoint",
"base": "LTX-Video",
"save_path": "checkpoints/LTXV",
"description": "Spatial upscaler model for LTX-Video. This model enhances the spatial resolution of generated videos.",
"reference": "https://huggingface.co/Lightricks/LTX-Video",
"filename": "ltxv-spatial-upscaler-0.9.7.safetensors",
"url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-spatial-upscaler-0.9.7.safetensors",
"size": "505MB"
},
{
"name": "LTX-Video Temporal Upscaler v0.9.7",
"type": "checkpoint",
"base": "LTX-Video",
"save_path": "checkpoints/LTXV",
"description": "Temporal upscaler model for LTX-Video. This model enhances the temporal resolution and smoothness of generated videos.",
"reference": "https://huggingface.co/Lightricks/LTX-Video",
"filename": "ltxv-temporal-upscaler-0.9.7.safetensors",
"url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-temporal-upscaler-0.9.7.safetensors",
"size": "524MB"
},
{
"name": "LTX-Video 13B v0.9.7",
"type": "checkpoint",
"base": "LTX-Video",
"save_path": "checkpoints/LTXV",
"description": "High-resolution quality LTX-Video 13B model.",
"reference": "https://huggingface.co/Lightricks/LTX-Video",
"filename": "ltxv-13b-0.9.7-dev.safetensors",
"url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-13b-0.9.7-dev.safetensors",
"size": "28.6GB"
},
{
"name": "LTX-Video 13B FP8 v0.9.7",
"type": "checkpoint",
"base": "LTX-Video",
"save_path": "checkpoints/LTXV",
"description": "Quantized version of the LTX-Video 13B model, optimized for lower VRAM usage while maintaining high quality.",
"reference": "https://huggingface.co/Lightricks/LTX-Video",
"filename": "ltxv-13b-0.9.7-dev-fp8.safetensors",
"url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltxv-13b-0.9.7-dev-fp8.safetensors",
"size": "15.7GB"
},
{
"name": "Comfy-Org/Wan2.1 i2v 480p 14B (bf16)",
"type": "diffusion_model",
@@ -372,339 +687,6 @@
"filename": "llava_llama3_fp16.safetensors",
"url": "https://huggingface.co/Comfy-Org/HunyuanVideo_repackaged/resolve/main/split_files/text_encoders/llava_llama3_fp16.safetensors",
"size": "16.1GB"
},

{
"name": "PixArt-Sigma-XL-2-512-MS.safetensors (diffusion)",
"type": "diffusion_model",
"base": "pixart-sigma",
"save_path": "diffusion_models/PixArt-Sigma",
"description": "PixArt-Sigma Diffusion model",
"reference": "https://huggingface.co/PixArt-alpha/PixArt-Sigma-XL-2-512-MS",
"filename": "PixArt-Sigma-XL-2-512-MS.safetensors",
"url": "https://huggingface.co/PixArt-alpha/PixArt-Sigma-XL-2-512-MS/resolve/main/transformer/diffusion_pytorch_model.safetensors",
"size": "2.44GB"
},
{
"name": "PixArt-Sigma-XL-2-1024-MS.safetensors (diffusion)",
"type": "diffusion_model",
"base": "pixart-sigma",
"save_path": "diffusion_models/PixArt-Sigma",
"description": "PixArt-Sigma Diffusion model",
"reference": "https://huggingface.co/PixArt-alpha/PixArt-Sigma-XL-2-1024-MS",
"filename": "PixArt-Sigma-XL-2-1024-MS.safetensors",
"url": "https://huggingface.co/PixArt-alpha/PixArt-Sigma-XL-2-1024-MS/resolve/main/transformer/diffusion_pytorch_model.safetensors",
"size": "2.44GB"
},
{
"name": "PixArt-XL-2-1024-MS.safetensors (diffusion)",
"type": "diffusion_model",
"base": "pixart-alpha",
"save_path": "diffusion_models/PixArt-Alpha",
"description": "PixArt-Alpha Diffusion model",
"reference": "https://huggingface.co/PixArt-alpha/PixArt-XL-2-1024-MS",
"filename": "PixArt-XL-2-1024-MS.safetensors",
"url": "https://huggingface.co/PixArt-alpha/PixArt-XL-2-1024-MS/resolve/main/transformer/diffusion_pytorch_model.safetensors",
"size": "2.45GB"
},

{
"name": "Comfy-Org/hunyuan_video_t2v_720p_bf16.safetensors",
"type": "diffusion_model",
"base": "Hunyuan Video",
"save_path": "diffusion_models/hunyuan_video",
"description": "Hunyuan Video diffusion model. Repackaged version.",
"reference": "https://huggingface.co/Comfy-Org/HunyuanVideo_repackaged",
"filename": "hunyuan_video_t2v_720p_bf16.safetensors",
"url": "https://huggingface.co/Comfy-Org/HunyuanVideo_repackaged/resolve/main/split_files/diffusion_models/hunyuan_video_t2v_720p_bf16.safetensors",
"size": "25.6GB"
},
{
"name": "Comfy-Org/hunyuan_video_vae_bf16.safetensors",
"type": "VAE",
"base": "Hunyuan Video",
"save_path": "VAE",
"description": "Hunyuan Video VAE model. Repackaged version.",
"reference": "https://huggingface.co/Comfy-Org/HunyuanVideo_repackaged",
"filename": "hunyuan_video_vae_bf16.safetensors",
"url": "https://huggingface.co/Comfy-Org/HunyuanVideo_repackaged/resolve/main/split_files/vae/hunyuan_video_vae_bf16.safetensors",
"size": "493MB"
},

{
"name": "LTX-Video 2B v0.9.1 Checkpoint",
"type": "checkpoint",
"base": "LTX-Video",
"save_path": "checkpoints/LTXV",
"description": "LTX-Video is the first DiT-based video generation model capable of generating high-quality videos in real-time. It produces 24 FPS videos at a 768x512 resolution faster than they can be watched. Trained on a large-scale dataset of diverse videos, the model generates high-resolution videos with realistic and varied content.",
"reference": "https://huggingface.co/Lightricks/LTX-Video",
"filename": "ltx-video-2b-v0.9.1.safetensors",
"url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltx-video-2b-v0.9.1.safetensors",
"size": "5.72GB"
},

{
"name": "XLabs-AI/flux-canny-controlnet-v3.safetensors",
"type": "controlnet",
"base": "FLUX.1",
"save_path": "xlabs/controlnets",
"description": "ControlNet checkpoints for FLUX.1-dev model by Black Forest Labs.",
"reference": "https://huggingface.co/XLabs-AI/flux-controlnet-collections",
"filename": "flux-canny-controlnet-v3.safetensors",
"url": "https://huggingface.co/XLabs-AI/flux-controlnet-collections/resolve/main/flux-canny-controlnet-v3.safetensors",
"size": "1.49GB"
},
{
"name": "XLabs-AI/flux-depth-controlnet-v3.safetensors",
"type": "controlnet",
"base": "FLUX.1",
"save_path": "xlabs/controlnets",
"description": "ControlNet checkpoints for FLUX.1-dev model by Black Forest Labs.",
"reference": "https://huggingface.co/XLabs-AI/flux-controlnet-collections",
"filename": "flux-depth-controlnet-v3.safetensors",
"url": "https://huggingface.co/XLabs-AI/flux-controlnet-collections/resolve/main/flux-depth-controlnet-v3.safetensors",
"size": "1.49GB"
},
{
"name": "XLabs-AI/flux-hed-controlnet-v3.safetensors",
"type": "controlnet",
"base": "FLUX.1",
"save_path": "xlabs/controlnets",
"description": "ControlNet checkpoints for FLUX.1-dev model by Black Forest Labs.",
"reference": "https://huggingface.co/XLabs-AI/flux-controlnet-collections",
"filename": "flux-hed-controlnet-v3.safetensors",
"url": "https://huggingface.co/XLabs-AI/flux-controlnet-collections/resolve/main/flux-hed-controlnet-v3.safetensors",
"size": "1.49GB"
},

{
"name": "XLabs-AI/realism_lora.safetensors",
"type": "lora",
"base": "FLUX.1",
"save_path": "xlabs/loras",
"description": "A checkpoint with trained LoRAs for FLUX.1-dev model by Black Forest Labs",
"reference": "https://huggingface.co/XLabs-AI/flux-lora-collection",
"filename": "realism_lora.safetensors",
"url": "https://huggingface.co/XLabs-AI/flux-lora-collection/resolve/main/realism_lora.safetensors",
"size": "44.8MB"
},
{
"name": "XLabs-AI/art_lora.safetensors",
"type": "lora",
"base": "FLUX.1",
"save_path": "xlabs/loras",
"description": "A checkpoint with trained LoRAs for FLUX.1-dev model by Black Forest Labs",
"reference": "https://huggingface.co/XLabs-AI/flux-lora-collection",
"filename": "art_lora.safetensors",
"url": "https://huggingface.co/XLabs-AI/flux-lora-collection/resolve/main/scenery_lora.safetensors",
"size": "44.8MB"
},
{
"name": "XLabs-AI/mjv6_lora.safetensors",
"type": "lora",
"base": "FLUX.1",
"save_path": "xlabs/loras",
"description": "A checkpoint with trained LoRAs for FLUX.1-dev model by Black Forest Labs",
"reference": "https://huggingface.co/XLabs-AI/flux-lora-collection",
"filename": "mjv6_lora.safetensors",
"url": "https://huggingface.co/XLabs-AI/flux-lora-collection/resolve/main/mjv6_lora.safetensors",
"size": "44.8MB"
},

{
"name": "XLabs-AI/flux-ip-adapter",
"type": "lora",
"base": "FLUX.1",
"save_path": "xlabs/ipadapters",
"description": "A checkpoint with trained LoRAs for FLUX.1-dev model by Black Forest Labs",
"reference": "https://huggingface.co/XLabs-AI/flux-ip-adapter",
"filename": "ip_adapter.safetensors",
"url": "https://huggingface.co/XLabs-AI/flux-ip-adapter/resolve/main/ip_adapter.safetensors",
"size": "982MB"
},

{
"name": "stabilityai/SD3.5-Large-Controlnet-Blur",
"type": "controlnet",
"base": "SD3.5",
"save_path": "controlnet/SD3.5",
"description": "Blur Controlnet model for SD3.5 Large",
"reference": "https://huggingface.co/stabilityai/stable-diffusion-3.5-controlnets",
"filename": "sd3.5_large_controlnet_blur.safetensors",
"url": "https://huggingface.co/stabilityai/stable-diffusion-3.5-controlnets/resolve/main/sd3.5_large_controlnet_blur.safetensors",
"size": "8.65GB"
},
{
"name": "stabilityai/SD3.5-Large-Controlnet-Canny",
"type": "controlnet",
"base": "SD3.5",
"save_path": "controlnet/SD3.5",
"description": "Canny Controlnet model for SD3.5 Large",
"reference": "https://huggingface.co/stabilityai/stable-diffusion-3.5-controlnets",
"filename": "sd3.5_large_controlnet_canny.safetensors",
"url": "https://huggingface.co/stabilityai/stable-diffusion-3.5-controlnets/resolve/main/sd3.5_large_controlnet_canny.safetensors",
"size": "8.65GB"
},
{
"name": "stabilityai/SD3.5-Large-Controlnet-Depth",
"type": "controlnet",
"base": "SD3.5",
"save_path": "controlnet/SD3.5",
"description": "Depth Controlnet model for SD3.5 Large",
"reference": "https://huggingface.co/stabilityai/stable-diffusion-3.5-controlnets",
"filename": "sd3.5_large_controlnet_depth.safetensors",
"url": "https://huggingface.co/stabilityai/stable-diffusion-3.5-controlnets/resolve/main/sd3.5_large_controlnet_depth.safetensors",
"size": "8.65GB"
},

{
"name": "LTX-Video 2B v0.9 Checkpoint",
"type": "checkpoint",
"base": "LTX-Video",
"save_path": "checkpoints/LTXV",
"description": "LTX-Video is the first DiT-based video generation model capable of generating high-quality videos in real-time. It produces 24 FPS videos at a 768x512 resolution faster than they can be watched. Trained on a large-scale dataset of diverse videos, the model generates high-resolution videos with realistic and varied content.",
"reference": "https://huggingface.co/Lightricks/LTX-Video",
"filename": "ltx-video-2b-v0.9.safetensors",
"url": "https://huggingface.co/Lightricks/LTX-Video/resolve/main/ltx-video-2b-v0.9.safetensors",
"size": "9.37GB"
},
{
"name": "InstantX/FLUX.1-dev-IP-Adapter",
"type": "IP-Adapter",
"base": "FLUX.1",
"save_path": "ipadapter-flux",
"description": "FLUX.1-dev-IP-Adapter",
"reference": "https://huggingface.co/InstantX/FLUX.1-dev-IP-Adapter",
"filename": "ip-adapter.bin",
"url": "https://huggingface.co/InstantX/FLUX.1-dev-IP-Adapter/resolve/main/ip-adapter.bin",
"size": "5.29GB"
},

{
"name": "Comfy-Org/sigclip_vision_384 (patch14_384)",
"type": "clip_vision",
"base": "sigclip",
"save_path": "clip_vision",
"description": "This clip vision model is required for FLUX.1 Redux.",
"reference": "https://huggingface.co/Comfy-Org/sigclip_vision_384/tree/main",
"filename": "sigclip_vision_patch14_384.safetensors",
"url": "https://huggingface.co/Comfy-Org/sigclip_vision_384/resolve/main/sigclip_vision_patch14_384.safetensors",
"size": "857MB"
},

{
"name": "comfyanonymous/flux_text_encoders - t5xxl (fp16)",
"type": "clip",
"base": "t5",
"save_path": "text_encoders/t5",
"description": "Text Encoders for FLUX (fp16)",
"reference": "https://huggingface.co/comfyanonymous/flux_text_encoders",
"filename": "t5xxl_fp16.safetensors",
"url": "https://huggingface.co/comfyanonymous/flux_text_encoders/resolve/main/t5xxl_fp16.safetensors",
"size": "9.79GB"
},
{
"name": "comfyanonymous/flux_text_encoders - t5xxl (fp8_e4m3fn)",
"type": "clip",
"base": "t5",
"save_path": "text_encoders/t5",
"description": "Text Encoders for FLUX (fp8_e4m3fn)",
"reference": "https://huggingface.co/comfyanonymous/flux_text_encoders",
"filename": "t5xxl_fp8_e4m3fn.safetensors",
"url": "https://huggingface.co/comfyanonymous/flux_text_encoders/resolve/main/t5xxl_fp8_e4m3fn.safetensors",
"size": "4.89GB"
},
{
"name": "comfyanonymous/flux_text_encoders - t5xxl (fp8_e4m3fn_scaled)",
"type": "clip",
"base": "t5",
"save_path": "text_encoders/t5",
"description": "Text Encoders for FLUX (fp8_e4m3fn_scaled)",
"reference": "https://huggingface.co/comfyanonymous/flux_text_encoders",
"filename": "t5xxl_fp8_e4m3fn_scaled.safetensors",
"url": "https://huggingface.co/comfyanonymous/flux_text_encoders/resolve/main/t5xxl_fp8_e4m3fn_scaled.safetensors",
"size": "5.16GB"
},

{
"name": "FLUX.1 [Dev] Diffusion model (scaled fp8)",
"type": "diffusion_model",
"base": "FLUX.1",
"save_path": "diffusion_models/FLUX1",
"description": "FLUX.1 [Dev] Diffusion model (scaled fp8)[w/Due to the large size of the model, it is recommended to download it through a browser if possible.]",
"reference": "https://huggingface.co/comfyanonymous/flux_dev_scaled_fp8_test",
"filename": "flux_dev_fp8_scaled_diffusion_model.safetensors",
"url": "https://huggingface.co/comfyanonymous/flux_dev_scaled_fp8_test/resolve/main/flux_dev_fp8_scaled_diffusion_model.safetensors",
"size": "11.9GB"
},

{
"name": "kijai/MoGe_ViT_L_fp16.safetensors",
"type": "MoGe",
"base": "MoGe",
"save_path": "MoGe",
"description": "Safetensors versions of [a/https://github.com/microsoft/MoGe](https://github.com/microsoft/MoGe)",
"reference": "https://huggingface.co/Kijai/MoGe_safetensors",
"filename": "MoGe_ViT_L_fp16.safetensors",
"url": "https://huggingface.co/Kijai/MoGe_safetensors/resolve/main/MoGe_ViT_L_fp16.safetensors",
"size": "628MB"
},
{
"name": "kijai/MoGe_ViT_L_fp16.safetensors",
"type": "MoGe",
"base": "MoGe",
"save_path": "MoGe",
"description": "Safetensors versions of [a/https://github.com/microsoft/MoGe](https://github.com/microsoft/MoGe)",
"reference": "https://huggingface.co/Kijai/MoGe_safetensors",
"filename": "MoGe_ViT_L_fp16.safetensors",
"url": "https://huggingface.co/Kijai/MoGe_safetensors/resolve/main/MoGe_ViT_L_fp16.safetensors",
"size": "1.26GB"
},

{
"name": "pulid_flux_v0.9.1.safetensors",
"type": "PuLID",
"base": "FLUX",
"save_path": "pulid",
"description": "This is required for PuLID (FLUX)",
"reference": "https://huggingface.co/guozinan/PuLID",
"filename": "pulid_flux_v0.9.1.safetensors",
"url": "https://huggingface.co/guozinan/PuLID/resolve/main/pulid_flux_v0.9.1.safetensors",
"size": "1.14GB"
},
{
"name": "pulid_v1.1.safetensors",
"type": "PuLID",
"base": "SDXL",
"save_path": "pulid",
"description": "This is required for PuLID (SDXL)",
"reference": "https://huggingface.co/guozinan/PuLID",
"filename": "pulid_v1.1.safetensors",
"url": "https://huggingface.co/guozinan/PuLID/resolve/main/pulid_v1.1.safetensors",
"size": "984MB"
},

{
"name": "Kolors-IP-Adapter-Plus.bin (Kwai-Kolors/Kolors-IP-Adapter-Plus)",
"type": "IP-Adapter",
"base": "Kolors",
"save_path": "ipadapter",
"description": "You can use this model in the [a/ComfyUI IPAdapter plus](https://github.com/cubiq/ComfyUI_IPAdapter_plus) extension.",
"reference": "https://huggingface.co/Kwai-Kolors/Kolors-IP-Adapter-Plus",
"filename": "Kolors-IP-Adapter-Plus.bin",
"url": "https://huggingface.co/Kwai-Kolors/Kolors-IP-Adapter-Plus/resolve/main/ip_adapter_plus_general.bin",
"size": "1.01GB"
},
{
"name": "Kolors-IP-Adapter-FaceID-Plus.bin (Kwai-Kolors/Kolors-IP-Adapter-Plus)",
"type": "IP-Adapter",
"base": "Kolors",
"save_path": "ipadapter",
"description": "You can use this model in the [a/ComfyUI IPAdapter plus](https://github.com/cubiq/ComfyUI_IPAdapter_plus) extension.",
"reference": "https://huggingface.co/Kwai-Kolors/Kolors-IP-Adapter-FaceID-Plus",
"filename": "Kolors-IP-Adapter-FaceID-Plus.bin",
"url": "https://huggingface.co/Kwai-Kolors/Kolors-IP-Adapter-FaceID-Plus/resolve/main/ipa-faceid-plus.bin",
"size": "2.39GB"
}
]
}

@@ -1,5 +1,25 @@
{
"custom_nodes": [
{
"author": "Comfy-Org",
"title": "ComfyUI React Extension Template",
"reference": "https://github.com/Comfy-Org/ComfyUI-React-Extension-Template",
"files": [
"https://github.com/Comfy-Org/ComfyUI-React-Extension-Template"
],
"install_type": "git-clone",
"description": "A minimal template for creating React/TypeScript frontend extensions for ComfyUI, with complete boilerplate setup including internationalization and unit testing."
},
{
"author": "comfyui-wiki",
"title": "ComfyUI-i18n-demo",
"reference": "https://github.com/comfyui-wiki/ComfyUI-i18n-demo",
"files": [
"https://github.com/comfyui-wiki/ComfyUI-i18n-demo"
],
"install_type": "git-clone",
"description": "ComfyUI custom node develop i18n support demo "
},
{
"author": "Suzie1",
"title": "Guide To Making Custom Nodes in ComfyUI",
@@ -321,6 +341,26 @@
],
"description": "Dynamic Node examples for ComfyUI",
"install_type": "git-clone"
},
{
"author": "Jonathon-Doran",
"title": "remote-combo-demo",
"reference": "https://github.com/Jonathon-Doran/remote-combo-demo",
"files": [
"https://github.com/Jonathon-Doran/remote-combo-demo"
],
"install_type": "git-clone",
"description": "A minimal test suite demonstrating how remote COMBO inputs behave in ComfyUI, with and without force_input"
},
{
"author": "J1mB091",
"title": "ComfyUI-J1mB091 Custom Nodes",
"reference": "https://github.com/J1mB091/ComfyUI-J1mB091",
"files": [
"https://github.com/J1mB091/ComfyUI-J1mB091"
],
"install_type": "git-clone",
"description": "Vibe Coded ComfyUI Custom Nodes"
}
]
}
@@ -1,373 +0,0 @@
{
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "aaaaaaaaaa"
      },
      "source": [
        "Git clone the repo and install the requirements. (ignore the pip errors about protobuf)"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "bbbbbbbbbb"
      },
      "outputs": [],
      "source": [
        "# #@title Environment Setup\n",
        "\n",
        "from pathlib import Path\n",
        "\n",
        "OPTIONS = {}\n",
        "\n",
        "USE_GOOGLE_DRIVE = True #@param {type:\"boolean\"}\n",
        "UPDATE_COMFY_UI = True #@param {type:\"boolean\"}\n",
        "USE_COMFYUI_MANAGER = True #@param {type:\"boolean\"}\n",
        "INSTALL_CUSTOM_NODES_DEPENDENCIES = True #@param {type:\"boolean\"}\n",
        "OPTIONS['USE_GOOGLE_DRIVE'] = USE_GOOGLE_DRIVE\n",
        "OPTIONS['UPDATE_COMFY_UI'] = UPDATE_COMFY_UI\n",
        "OPTIONS['USE_COMFYUI_MANAGER'] = USE_COMFYUI_MANAGER\n",
        "OPTIONS['INSTALL_CUSTOM_NODES_DEPENDENCIES'] = INSTALL_CUSTOM_NODES_DEPENDENCIES\n",
        "\n",
        "current_dir = !pwd\n",
        "WORKSPACE = f\"{current_dir[0]}/ComfyUI\"\n",
        "\n",
        "if OPTIONS['USE_GOOGLE_DRIVE']:\n",
        "  !echo \"Mounting Google Drive...\"\n",
        "  %cd /\n",
        "\n",
        "  from google.colab import drive\n",
        "  drive.mount('/content/drive')\n",
        "\n",
        "  WORKSPACE = \"/content/drive/MyDrive/ComfyUI\"\n",
        "  %cd /content/drive/MyDrive\n",
        "\n",
        "![ ! -d $WORKSPACE ] && echo -= Initial setup ComfyUI =- && git clone https://github.com/comfyanonymous/ComfyUI\n",
        "%cd $WORKSPACE\n",
        "\n",
        "if OPTIONS['UPDATE_COMFY_UI']:\n",
        "  !echo -= Updating ComfyUI =-\n",
        "\n",
        "  # Fix for file permissions being reset on Google Drive.\n",
        "  ![ -f \".ci/nightly/update_windows/update_comfyui_and_python_dependencies.bat\" ] && chmod 755 .ci/nightly/update_windows/update_comfyui_and_python_dependencies.bat\n",
        "  ![ -f \".ci/nightly/windows_base_files/run_nvidia_gpu.bat\" ] && chmod 755 .ci/nightly/windows_base_files/run_nvidia_gpu.bat\n",
        "  ![ -f \".ci/update_windows/update_comfyui_and_python_dependencies.bat\" ] && chmod 755 .ci/update_windows/update_comfyui_and_python_dependencies.bat\n",
        "  ![ -f \".ci/update_windows_cu118/update_comfyui_and_python_dependencies.bat\" ] && chmod 755 .ci/update_windows_cu118/update_comfyui_and_python_dependencies.bat\n",
        "  ![ -f \".ci/update_windows/update.py\" ] && chmod 755 .ci/update_windows/update.py\n",
        "  ![ -f \".ci/update_windows/update_comfyui.bat\" ] && chmod 755 .ci/update_windows/update_comfyui.bat\n",
        "  ![ -f \".ci/update_windows/README_VERY_IMPORTANT.txt\" ] && chmod 755 .ci/update_windows/README_VERY_IMPORTANT.txt\n",
        "  ![ -f \".ci/update_windows/run_cpu.bat\" ] && chmod 755 .ci/update_windows/run_cpu.bat\n",
        "  ![ -f \".ci/update_windows/run_nvidia_gpu.bat\" ] && chmod 755 .ci/update_windows/run_nvidia_gpu.bat\n",
        "\n",
        "  !git pull\n",
        "\n",
        "!echo -= Install dependencies =-\n",
        "!pip3 install accelerate\n",
        "!pip3 install einops transformers>=4.28.1 safetensors>=0.4.2 aiohttp pyyaml Pillow scipy tqdm psutil tokenizers>=0.13.3\n",
        "!pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121\n",
        "!pip3 install torchsde\n",
        "!pip3 install kornia>=0.7.1 spandrel soundfile sentencepiece\n",
        "\n",
        "if OPTIONS['USE_COMFYUI_MANAGER']:\n",
        "  %cd custom_nodes\n",
        "\n",
        "  # Fix for file permissions being reset on Google Drive.\n",
        "  ![ -f \"ComfyUI-Manager/check.sh\" ] && chmod 755 ComfyUI-Manager/check.sh\n",
        "  ![ -f \"ComfyUI-Manager/scan.sh\" ] && chmod 755 ComfyUI-Manager/scan.sh\n",
        "  ![ -f \"ComfyUI-Manager/node_db/dev/scan.sh\" ] && chmod 755 ComfyUI-Manager/node_db/dev/scan.sh\n",
        "  ![ -f \"ComfyUI-Manager/node_db/tutorial/scan.sh\" ] && chmod 755 ComfyUI-Manager/node_db/tutorial/scan.sh\n",
        "  ![ -f \"ComfyUI-Manager/scripts/install-comfyui-venv-linux.sh\" ] && chmod 755 ComfyUI-Manager/scripts/install-comfyui-venv-linux.sh\n",
        "  ![ -f \"ComfyUI-Manager/scripts/install-comfyui-venv-win.bat\" ] && chmod 755 ComfyUI-Manager/scripts/install-comfyui-venv-win.bat\n",
        "\n",
        "  ![ ! -d ComfyUI-Manager ] && echo -= Initial setup ComfyUI-Manager =- && git clone https://github.com/ltdrdata/ComfyUI-Manager\n",
        "  %cd ComfyUI-Manager\n",
        "  !git pull\n",
        "\n",
        "%cd $WORKSPACE\n",
        "\n",
        "if OPTIONS['INSTALL_CUSTOM_NODES_DEPENDENCIES']:\n",
        "  !echo -= Install custom nodes dependencies =-\n",
        "  !pip install GitPython\n",
        "  !python custom_nodes/ComfyUI-Manager/cm-cli.py restore-dependencies\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "cccccccccc"
      },
      "source": [
        "Download some models/checkpoints/VAEs or custom ComfyUI nodes (uncomment the commands for the ones you want)"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "dddddddddd"
      },
      "outputs": [],
      "source": [
        "# Checkpoints\n",
        "\n",
        "### SDXL\n",
        "### I recommend these workflow examples: https://comfyanonymous.github.io/ComfyUI_examples/sdxl/\n",
        "\n",
        "#!wget -c https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0/resolve/main/sd_xl_base_1.0.safetensors -P ./models/checkpoints/\n",
        "#!wget -c https://huggingface.co/stabilityai/stable-diffusion-xl-refiner-1.0/resolve/main/sd_xl_refiner_1.0.safetensors -P ./models/checkpoints/\n",
        "\n",
        "# SDXL ReVision\n",
        "#!wget -c https://huggingface.co/comfyanonymous/clip_vision_g/resolve/main/clip_vision_g.safetensors -P ./models/clip_vision/\n",
        "\n",
        "# SD1.5\n",
        "!wget -c https://huggingface.co/runwayml/stable-diffusion-v1-5/resolve/main/v1-5-pruned-emaonly.ckpt -P ./models/checkpoints/\n",
        "\n",
        "# SD2\n",
        "#!wget -c https://huggingface.co/stabilityai/stable-diffusion-2-1-base/resolve/main/v2-1_512-ema-pruned.safetensors -P ./models/checkpoints/\n",
        "#!wget -c https://huggingface.co/stabilityai/stable-diffusion-2-1/resolve/main/v2-1_768-ema-pruned.safetensors -P ./models/checkpoints/\n",
        "\n",
        "# Some SD1.5 anime style\n",
        "#!wget -c https://huggingface.co/WarriorMama777/OrangeMixs/resolve/main/Models/AbyssOrangeMix2/AbyssOrangeMix2_hard.safetensors -P ./models/checkpoints/\n",
        "#!wget -c https://huggingface.co/WarriorMama777/OrangeMixs/resolve/main/Models/AbyssOrangeMix3/AOM3A1_orangemixs.safetensors -P ./models/checkpoints/\n",
        "#!wget -c https://huggingface.co/WarriorMama777/OrangeMixs/resolve/main/Models/AbyssOrangeMix3/AOM3A3_orangemixs.safetensors -P ./models/checkpoints/\n",
        "#!wget -c https://huggingface.co/Linaqruf/anything-v3.0/resolve/main/anything-v3-fp16-pruned.safetensors -P ./models/checkpoints/\n",
        "\n",
        "# Waifu Diffusion 1.5 (anime style SD2.x 768-v)\n",
        "#!wget -c https://huggingface.co/waifu-diffusion/wd-1-5-beta3/resolve/main/wd-illusion-fp16.safetensors -P ./models/checkpoints/\n",
        "\n",
        "\n",
        "# unCLIP models\n",
        "#!wget -c https://huggingface.co/comfyanonymous/illuminatiDiffusionV1_v11_unCLIP/resolve/main/illuminatiDiffusionV1_v11-unclip-h-fp16.safetensors -P ./models/checkpoints/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/wd-1.5-beta2_unCLIP/resolve/main/wd-1-5-beta2-aesthetic-unclip-h-fp16.safetensors -P ./models/checkpoints/\n",
        "\n",
        "\n",
        "# VAE\n",
        "!wget -c https://huggingface.co/stabilityai/sd-vae-ft-mse-original/resolve/main/vae-ft-mse-840000-ema-pruned.safetensors -P ./models/vae/\n",
        "#!wget -c https://huggingface.co/WarriorMama777/OrangeMixs/resolve/main/VAEs/orangemix.vae.pt -P ./models/vae/\n",
        "#!wget -c https://huggingface.co/hakurei/waifu-diffusion-v1-4/resolve/main/vae/kl-f8-anime2.ckpt -P ./models/vae/\n",
        "\n",
        "\n",
        "# Loras\n",
        "#!wget -c https://civitai.com/api/download/models/10350 -O ./models/loras/theovercomer8sContrastFix_sd21768.safetensors #theovercomer8sContrastFix SD2.x 768-v\n",
        "#!wget -c https://civitai.com/api/download/models/10638 -O ./models/loras/theovercomer8sContrastFix_sd15.safetensors #theovercomer8sContrastFix SD1.x\n",
        "#!wget -c https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0/resolve/main/sd_xl_offset_example-lora_1.0.safetensors -P ./models/loras/ #SDXL offset noise lora\n",
        "\n",
        "\n",
        "# T2I-Adapter\n",
        "#!wget -c https://huggingface.co/TencentARC/T2I-Adapter/resolve/main/models/t2iadapter_depth_sd14v1.pth -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/TencentARC/T2I-Adapter/resolve/main/models/t2iadapter_seg_sd14v1.pth -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/TencentARC/T2I-Adapter/resolve/main/models/t2iadapter_sketch_sd14v1.pth -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/TencentARC/T2I-Adapter/resolve/main/models/t2iadapter_keypose_sd14v1.pth -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/TencentARC/T2I-Adapter/resolve/main/models/t2iadapter_openpose_sd14v1.pth -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/TencentARC/T2I-Adapter/resolve/main/models/t2iadapter_color_sd14v1.pth -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/TencentARC/T2I-Adapter/resolve/main/models/t2iadapter_canny_sd14v1.pth -P ./models/controlnet/\n",
        "\n",
        "# T2I Styles Model\n",
        "#!wget -c https://huggingface.co/TencentARC/T2I-Adapter/resolve/main/models/t2iadapter_style_sd14v1.pth -P ./models/style_models/\n",
        "\n",
        "# CLIPVision model (needed for styles model)\n",
        "#!wget -c https://huggingface.co/openai/clip-vit-large-patch14/resolve/main/pytorch_model.bin -O ./models/clip_vision/clip_vit14.bin\n",
        "\n",
        "\n",
        "# ControlNet\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11e_sd15_ip2p_fp16.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11e_sd15_shuffle_fp16.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11p_sd15_canny_fp16.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11f1p_sd15_depth_fp16.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11p_sd15_inpaint_fp16.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11p_sd15_lineart_fp16.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11p_sd15_mlsd_fp16.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11p_sd15_normalbae_fp16.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11p_sd15_openpose_fp16.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11p_sd15_scribble_fp16.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11p_sd15_seg_fp16.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11p_sd15_softedge_fp16.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11p_sd15s2_lineart_anime_fp16.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/comfyanonymous/ControlNet-v1-1_fp16_safetensors/resolve/main/control_v11u_sd15_tile_fp16.safetensors -P ./models/controlnet/\n",
        "\n",
        "# ControlNet SDXL\n",
        "#!wget -c https://huggingface.co/stabilityai/control-lora/resolve/main/control-LoRAs-rank256/control-lora-canny-rank256.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/stabilityai/control-lora/resolve/main/control-LoRAs-rank256/control-lora-depth-rank256.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/stabilityai/control-lora/resolve/main/control-LoRAs-rank256/control-lora-recolor-rank256.safetensors -P ./models/controlnet/\n",
        "#!wget -c https://huggingface.co/stabilityai/control-lora/resolve/main/control-LoRAs-rank256/control-lora-sketch-rank256.safetensors -P ./models/controlnet/\n",
        "\n",
        "# ControlNet Preprocessor nodes by Fannovel16\n",
        "#!cd custom_nodes && git clone https://github.com/Fannovel16/comfy_controlnet_preprocessors; cd comfy_controlnet_preprocessors && python install.py\n",
        "\n",
        "\n",
        "# GLIGEN\n",
        "#!wget -c https://huggingface.co/comfyanonymous/GLIGEN_pruned_safetensors/resolve/main/gligen_sd14_textbox_pruned_fp16.safetensors -P ./models/gligen/\n",
        "\n",
        "\n",
        "# ESRGAN upscale model\n",
        "#!wget -c https://github.com/xinntao/Real-ESRGAN/releases/download/v0.1.0/RealESRGAN_x4plus.pth -P ./models/upscale_models/\n",
        "#!wget -c https://huggingface.co/sberbank-ai/Real-ESRGAN/resolve/main/RealESRGAN_x2.pth -P ./models/upscale_models/\n",
        "#!wget -c https://huggingface.co/sberbank-ai/Real-ESRGAN/resolve/main/RealESRGAN_x4.pth -P ./models/upscale_models/\n",
        "\n",
        "\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "kkkkkkkkkkkkkkk"
      },
      "source": [
        "### Run ComfyUI with cloudflared (Recommended Way)\n",
        "\n",
        "\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "jjjjjjjjjjjjjj"
      },
      "outputs": [],
      "source": [
        "!wget -P ~ https://github.com/cloudflare/cloudflared/releases/latest/download/cloudflared-linux-amd64.deb\n",
        "!dpkg -i ~/cloudflared-linux-amd64.deb\n",
        "\n",
        "import subprocess\n",
        "import threading\n",
        "import time\n",
        "import socket\n",
        "import urllib.request\n",
        "\n",
        "def iframe_thread(port):\n",
        "  while True:\n",
        "    time.sleep(0.5)\n",
        "    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n",
        "    result = sock.connect_ex(('127.0.0.1', port))\n",
        "    if result == 0:\n",
        "      break\n",
        "    sock.close()\n",
        "  print(\"\\nComfyUI finished loading, trying to launch cloudflared (if it gets stuck here cloudflared is having issues)\\n\")\n",
        "\n",
        "  p = subprocess.Popen([\"cloudflared\", \"tunnel\", \"--url\", \"http://127.0.0.1:{}\".format(port)], stdout=subprocess.PIPE, stderr=subprocess.PIPE)\n",
        "  for line in p.stderr:\n",
        "    l = line.decode()\n",
        "    if \"trycloudflare.com \" in l:\n",
        "      print(\"This is the URL to access ComfyUI:\", l[l.find(\"http\"):], end='')\n",
        "    #print(l, end='')\n",
        "\n",
        "\n",
        "threading.Thread(target=iframe_thread, daemon=True, args=(8188,)).start()\n",
        "\n",
        "!python main.py --dont-print-server"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "kkkkkkkkkkkkkk"
      },
      "source": [
        "### Run ComfyUI with localtunnel\n",
        "\n",
        "\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "jjjjjjjjjjjjj"
      },
      "outputs": [],
      "source": [
        "!npm install -g localtunnel\n",
        "\n",
        "import subprocess\n",
        "import threading\n",
        "import time\n",
        "import socket\n",
        "import urllib.request\n",
        "\n",
        "def iframe_thread(port):\n",
        "  while True:\n",
        "    time.sleep(0.5)\n",
        "    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n",
        "    result = sock.connect_ex(('127.0.0.1', port))\n",
        "    if result == 0:\n",
        "      break\n",
        "    sock.close()\n",
        "  print(\"\\nComfyUI finished loading, trying to launch localtunnel (if it gets stuck here localtunnel is having issues)\\n\")\n",
        "\n",
        "  print(\"The password/endpoint IP for localtunnel is:\", urllib.request.urlopen('https://ipv4.icanhazip.com').read().decode('utf8').strip(\"\\n\"))\n",
        "  p = subprocess.Popen([\"lt\", \"--port\", \"{}\".format(port)], stdout=subprocess.PIPE)\n",
        "  for line in p.stdout:\n",
        "    print(line.decode(), end='')\n",
        "\n",
        "\n",
        "threading.Thread(target=iframe_thread, daemon=True, args=(8188,)).start()\n",
        "\n",
        "!python main.py --dont-print-server"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "gggggggggg"
      },
      "source": [
        "### Run ComfyUI with colab iframe (use only in case the previous way with localtunnel doesn't work)\n",
        "\n",
        "You should see the UI appear in an iframe. If you get a 403 error, it's your Firefox settings or an extension that's messing things up.\n",
        "\n",
        "If you want to open it in another window use the link.\n",
        "\n",
        "Note that some UI features like live image previews won't work because the colab iframe blocks websockets."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "hhhhhhhhhh"
      },
      "outputs": [],
      "source": [
        "import threading\n",
        "import time\n",
        "import socket\n",
        "def iframe_thread(port):\n",
        "  while True:\n",
        "    time.sleep(0.5)\n",
        "    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n",
        "    result = sock.connect_ex(('127.0.0.1', port))\n",
        "    if result == 0:\n",
        "      break\n",
        "    sock.close()\n",
        "  from google.colab import output\n",
        "  output.serve_kernel_port_as_iframe(port, height=1024)\n",
        "  print(\"to open it in a window you can open this link here:\")\n",
        "  output.serve_kernel_port_as_window(port)\n",
        "\n",
        "threading.Thread(target=iframe_thread, daemon=True, args=(8188,)).start()\n",
        "\n",
        "!python main.py --dont-print-server"
      ]
    }
  ],
  "metadata": {
    "accelerator": "GPU",
    "colab": {
      "provenance": []
    },
    "gpuClass": "standard",
    "kernelspec": {
      "display_name": "Python 3",
      "name": "python3"
    },
    "language_info": {
      "name": "python"
    }
  },
  "nbformat": 4,
  "nbformat_minor": 0
}
1414 openapi.yaml Normal file
File diff suppressed because it is too large
@@ -1,15 +1,65 @@
[build-system]
requires = ["setuptools >= 61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "comfyui-manager"
license = { text = "GPL-3.0-only" }
version = "4.0.2"
requires-python = ">= 3.9"
description = "ComfyUI-Manager provides features to install and manage custom nodes for ComfyUI, as well as various functionalities to assist with ComfyUI."
version = "3.31.12"
license = { file = "LICENSE.txt" }
dependencies = ["GitPython", "PyGithub", "matrix-client==0.4.0", "transformers", "huggingface-hub>0.20", "typer", "rich", "typing-extensions", "toml", "uv", "chardet"]
readme = "README.md"
keywords = ["comfyui", "comfyui-manager"]

maintainers = [
    { name = "Dr.Lt.Data", email = "dr.lt.data@gmail.com" },
    { name = "Yoland Yan", email = "yoland@comfy.org" },
    { name = "James Kwon", email = "hongilkwon316@gmail.com" },
    { name = "Robin Huang", email = "robin@comfy.org" },
]

classifiers = [
    "Development Status :: 5 - Production/Stable",
    "Intended Audience :: Developers",
    "License :: OSI Approved :: GNU General Public License v3 (GPLv3)",
]

dependencies = [
    "GitPython",
    "PyGithub",
    # "matrix-nio",
    "transformers",
    "huggingface-hub>0.20",
    "typer",
    "rich",
    "typing-extensions",
    "toml",
    "uv",
    "chardet"
]

[project.optional-dependencies]
dev = ["pre-commit", "pytest", "ruff", "pytest-cov"]

[project.urls]
Repository = "https://github.com/ltdrdata/ComfyUI-Manager"
# Used by Comfy Registry https://comfyregistry.org

[tool.comfy]
PublisherId = "drltdata"
DisplayName = "ComfyUI-Manager"
Icon = ""
[tool.setuptools.packages.find]
where = ["."]
include = ["comfyui_manager*"]

[project.scripts]
cm-cli = "comfyui_manager.cm_cli.__main__:main"

[tool.ruff]
line-length = 120
target-version = "py39"

[tool.ruff.lint]
select = [
    "E4", # default
    "E7", # default
    "E9", # default
    "F",  # default
    "I",  # isort-like behavior (import statement sorting)
]
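On the `[project.scripts]` table above: setuptools turns `cm-cli = "comfyui_manager.cm_cli.__main__:main"` into a `cm-cli` console command that imports `comfyui_manager.cm_cli.__main__` and calls its `main()`. A minimal sketch of the shape such a module needs — illustrative only, and assuming a Typer app since `typer` is a declared dependency:

```python
# comfyui_manager/cm_cli/__main__.py — illustrative sketch, not the actual module
import typer

app = typer.Typer(help="ComfyUI-Manager command line interface")

@app.command()
def restore_dependencies():
    """Hypothetical subcommand: reinstall requirements for enabled custom nodes."""
    typer.echo("restoring dependencies...")

def main():
    # The console script generated from [project.scripts] resolves to this callable.
    app()

if __name__ == "__main__":
    main()
```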
@@ -1,6 +1,6 @@
GitPython
PyGithub
matrix-client==0.4.0
# matrix-nio
transformers
huggingface-hub>0.20
typer
29 scanner.py
@@ -94,7 +94,7 @@ def extract_nodes(code_text):
            return s
        else:
            return set()
    except:
    except Exception:
        return set()

@@ -102,12 +102,8 @@ def extract_nodes(code_text):
def scan_in_file(filename, is_builtin=False):
    global builtin_nodes

    try:
        with open(filename, encoding='utf-8') as file:
            code = file.read()
    except UnicodeDecodeError:
        with open(filename, encoding='cp949') as file:
            code = file.read()
    with open(filename, encoding='utf-8', errors='ignore') as file:
        code = file.read()

    pattern = r"_CLASS_MAPPINGS\s*=\s*{([^}]*)}"
    regex = re.compile(pattern, re.MULTILINE | re.DOTALL)
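For context on the hunk above: the pattern matches any assignment whose target ends in `_CLASS_MAPPINGS` (typically `NODE_CLASS_MAPPINGS`) and captures the dict body, from which node names can then be pulled out. A small self-contained sketch of that idea; the sample snippet and the key-extraction regex here are illustrative, not the scanner's actual follow-up parsing:

```python
import re

# Hypothetical custom-node source; the real scanner reads files from cloned repos.
sample = '''
NODE_CLASS_MAPPINGS = {
    "ImageBlend": ImageBlend,
    "ImageSharpen": ImageSharpen,
}
'''

pattern = r"_CLASS_MAPPINGS\s*=\s*{([^}]*)}"
regex = re.compile(pattern, re.MULTILINE | re.DOTALL)

for match in regex.finditer(sample):
    body = match.group(1)
    # Keys of the dict literal are the node names a custom node exposes.
    print(re.findall(r'["\']([^"\']+)["\']\s*:', body))
    # -> ['ImageBlend', 'ImageSharpen']
```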
@@ -259,13 +255,13 @@ def clone_or_pull_git_repository(git_url):
            repo.git.submodule('update', '--init', '--recursive')
            print(f"Pulling {repo_name}...")
    except Exception as e:
        print(f"Pulling {repo_name} failed: {e}")
        print(f"Failed to pull '{repo_name}': {e}")
    else:
        try:
            Repo.clone_from(git_url, repo_dir, recursive=True)
            print(f"Cloning {repo_name}...")
        except Exception as e:
            print(f"Cloning {repo_name} failed: {e}")
            print(f"Failed to clone '{repo_name}': {e}")

def update_custom_nodes():
@@ -297,7 +293,7 @@ def update_custom_nodes():
        pass

    def is_rate_limit_exceeded():
        return g.rate_limiting[0] == 0
        return g.rate_limiting[0] <= 20

    if is_rate_limit_exceeded():
        print(f"GitHub API Rate Limit Exceeded: remained - {(g.rate_limiting_resettime - datetime.datetime.now().timestamp())/60:.2f} min")
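On the `== 0` → `<= 20` change above: PyGithub's `Github.rate_limiting` is a `(remaining, limit)` tuple, so the new check treats the limit as exceeded while 20 requests still remain, leaving headroom instead of running the quota to exactly zero. A hedged sketch of the idea (the token setup is illustrative, not scanner.py's actual client construction):

```python
from github import Github  # PyGithub, already a project dependency

g = Github("<token>")  # illustrative; scanner.py builds its own client

def is_rate_limit_exceeded(headroom: int = 20) -> bool:
    # rate_limiting is a (remaining, limit) tuple, e.g. (4979, 5000).
    remaining, _limit = g.rate_limiting
    # Bail out while some quota is left so requests still in flight don't fail mid-run.
    return remaining <= headroom
```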
@@ -400,7 +396,7 @@ def update_custom_nodes():

        try:
            download_url(url, temp_dir)
        except:
        except Exception:
            print(f"[ERROR] Cannot download '{url}'")

    with concurrent.futures.ThreadPoolExecutor(10) as executor:
@@ -500,8 +496,15 @@ def gen_json(node_info):
        nodes_in_url, metadata_in_url = data[git_url]
        nodes = set(nodes_in_url)

        for x, desc in node_list_json.items():
            nodes.add(x.strip())
        try:
            for x, desc in node_list_json.items():
                nodes.add(x.strip())
        except Exception as e:
            print(f"\nERROR: Invalid json format '{node_list_json_path}'")
            print("------------------------------------------------------")
            print(e)
            print("------------------------------------------------------")
            node_list_json = {}

        metadata_in_url['title_aux'] = title

@@ -1,39 +0,0 @@
import os
import subprocess


def get_enabled_subdirectories_with_files(base_directory):
    subdirs_with_files = []
    for subdir in os.listdir(base_directory):
        try:
            full_path = os.path.join(base_directory, subdir)
            if os.path.isdir(full_path) and not subdir.endswith(".disabled") and not subdir.startswith('.') and subdir != '__pycache__':
                print(f"## Install dependencies for '{subdir}'")
                requirements_file = os.path.join(full_path, "requirements.txt")
                install_script = os.path.join(full_path, "install.py")

                if os.path.exists(requirements_file) or os.path.exists(install_script):
                    subdirs_with_files.append((full_path, requirements_file, install_script))
        except Exception as e:
            print(f"EXCEPTION During Dependencies INSTALL on '{subdir}':\n{e}")

    return subdirs_with_files


def install_requirements(requirements_file_path):
    if os.path.exists(requirements_file_path):
        subprocess.run(["pip", "install", "-r", requirements_file_path])


def run_install_script(install_script_path):
    if os.path.exists(install_script_path):
        subprocess.run(["python", install_script_path])


custom_nodes_directory = "custom_nodes"
subdirs_with_files = get_enabled_subdirectories_with_files(custom_nodes_directory)


for subdir, requirements_file, install_script in subdirs_with_files:
    install_requirements(requirements_file)
    run_install_script(install_script)
@@ -1,21 +0,0 @@
git clone https://github.com/comfyanonymous/ComfyUI
cd ComfyUI/custom_nodes
git clone https://github.com/ltdrdata/ComfyUI-Manager comfyui-manager
cd ..
python -m venv venv
source venv/bin/activate
python -m pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu121
python -m pip install -r requirements.txt
python -m pip install -r custom_nodes/comfyui-manager/requirements.txt
cd ..
echo "#!/bin/bash" > run_gpu.sh
echo "cd ComfyUI" >> run_gpu.sh
echo "source venv/bin/activate" >> run_gpu.sh
echo "python main.py --preview-method auto" >> run_gpu.sh
chmod +x run_gpu.sh

echo "#!/bin/bash" > run_cpu.sh
echo "cd ComfyUI" >> run_cpu.sh
echo "source venv/bin/activate" >> run_cpu.sh
echo "python main.py --preview-method auto --cpu" >> run_cpu.sh
chmod +x run_cpu.sh
@@ -1,17 +0,0 @@
git clone https://github.com/comfyanonymous/ComfyUI
cd ComfyUI/custom_nodes
git clone https://github.com/ltdrdata/ComfyUI-Manager comfyui-manager
cd ..
python -m venv venv
call venv/Scripts/activate
python -m pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu121
python -m pip install -r requirements.txt
python -m pip install -r custom_nodes/comfyui-manager/requirements.txt
cd ..
echo "cd ComfyUI" >> run_gpu.bat
echo "call venv/Scripts/activate" >> run_gpu.bat
echo "python main.py" >> run_gpu.bat

echo "cd ComfyUI" >> run_cpu.bat
echo "call venv/Scripts/activate" >> run_cpu.bat
echo "python main.py --cpu" >> run_cpu.bat
@@ -1,3 +0,0 @@
.\python_embeded\python.exe -s -m pip install gitpython
.\python_embeded\python.exe -c "import git; git.Repo.clone_from('https://github.com/ltdrdata/ComfyUI-Manager', './ComfyUI/custom_nodes/comfyui-manager')"
.\python_embeded\python.exe -m pip install -r ./ComfyUI/custom_nodes/comfyui-manager/requirements.txt
34 tests/.gitignore vendored Normal file
@@ -0,0 +1,34 @@
# Test environment and artifacts

# Virtual environment
test_venv/
venv/
env/

# pytest cache
.pytest_cache/
__pycache__/
*.pyc
*.pyo

# Coverage reports (module-specific naming)
.coverage
.coverage.*
htmlcov*/
coverage*.xml
*.cover

# Test artifacts
.tox/
.hypothesis/

# IDE
.vscode/
.idea/
*.swp
*.swo
*~

# OS
.DS_Store
Thumbs.db
181 tests/README.md Normal file
@@ -0,0 +1,181 @@
# ComfyUI Manager Test Suite

This directory contains all tests for the ComfyUI Manager project, organized by module structure.

## Directory Structure

```
tests/
├── setup_test_env.sh       # Setup isolated test environment
├── requirements.txt        # Test dependencies
├── pytest.ini              # Global pytest configuration
├── .gitignore              # Ignore test artifacts
│
└── common/                 # Tests for comfyui_manager/common/
    └── pip_util/           # Tests for pip_util.py
        ├── README.md       # pip_util test documentation
        ├── conftest.py     # pip_util test fixtures
        ├── pytest.ini      # pip_util-specific pytest config
        └── test_*.py       # Actual test files (to be created)
```

## Quick Start

### 1. Setup Test Environment (One Time)

```bash
cd tests
./setup_test_env.sh
```

This creates an isolated virtual environment with all test dependencies.

### 2. Run Tests

```bash
# Activate test environment
source test_venv/bin/activate

# Run all tests from root
cd tests
pytest

# Run specific module tests
cd tests/common/pip_util
pytest

# Deactivate when done
deactivate
```

## Test Organization

Tests mirror the source code structure:

| Source Code | Test Location |
|-------------|---------------|
| `comfyui_manager/common/pip_util.py` | `tests/common/pip_util/test_*.py` |
| `comfyui_manager/common/other.py` | `tests/common/other/test_*.py` |
| `comfyui_manager/module/file.py` | `tests/module/file/test_*.py` |

## Writing Tests

1. Create a test directory matching the source structure
2. Add `conftest.py` for module-specific fixtures (see the sketch below)
3. Add `pytest.ini` for module-specific configuration (optional)
4. Create `test_*.py` files with actual tests
5. Document in the module-specific README

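As referenced in step 2, a minimal sketch of what a module's `conftest.py` fixture and a first test file could look like; the policy-file format and names here are hypothetical, not pip_util's actual fixtures:

```python
# tests/common/pip_util/conftest.py — hypothetical fixture module
import pytest

@pytest.fixture
def tmp_policy_file(tmp_path):
    """Write a throwaway policy file and hand its path to the test."""
    policy = tmp_path / "policy.toml"
    policy.write_text('[pins]\nnumpy = "==1.26.4"\n')
    return policy

# tests/common/pip_util/test_policy_loading.py — hypothetical test file
def test_policy_file_exists(tmp_policy_file):
    assert tmp_policy_file.read_text().startswith("[pins]")
```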
## Test Categories

Use pytest markers to categorize tests:

```python
@pytest.mark.unit
def test_simple_function():
    pass

@pytest.mark.integration
def test_complex_workflow():
    pass

@pytest.mark.e2e
def test_full_system():
    pass
```

Run by category:
```bash
pytest -m unit          # Only unit tests
pytest -m integration   # Only integration tests
pytest -m e2e           # Only end-to-end tests
```

## Coverage Reports

Coverage reports are generated per module:

```bash
cd tests/common/pip_util
pytest  # Generates htmlcov_pip_util/ and coverage_pip_util.xml
```

## Environment Isolation

**Why use a venv?**
- ✅ Prevents test dependencies from corrupting the main environment
- ✅ Allows safe package installation/uninstallation during tests
- ✅ Consistent test results across machines
- ✅ Easy to recreate a clean environment

## Available Test Modules

- **[common/pip_util](common/pip_util/)** - Policy-based pip package management system tests
  - Unit tests for policy loading, parsing, condition evaluation (see the sketch below)
  - Integration tests for policy application (60% of tests)
  - End-to-end workflow tests

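For a feel of the unit-test style used there, a hedged sketch of a condition-evaluation test; `evaluate_condition` and its signature are hypothetical stand-ins, since pip_util's real API isn't shown here:

```python
import pytest

# Hypothetical stand-in for pip_util's condition evaluator.
def evaluate_condition(condition: str, env: dict) -> bool:
    key, _, expected = condition.partition("==")
    return env.get(key.strip()) == expected.strip()

@pytest.mark.unit
@pytest.mark.parametrize("condition,expected", [
    ("os == linux", True),
    ("os == windows", False),
])
def test_evaluate_condition(condition, expected):
    assert evaluate_condition(condition, {"os": "linux"}) is expected
```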
## Adding New Test Modules

1. Create the directory structure: `tests/module_path/component_name/`
2. Add `conftest.py` with fixtures
3. Add `pytest.ini` if needed (optional)
4. Add `README.md` documenting the tests
5. Create `test_*.py` files

Example:
```bash
mkdir -p tests/data_models/config
cd tests/data_models/config
touch conftest.py README.md test_config_loader.py
```

## CI/CD Integration

Tests are designed to run in CI/CD pipelines:

```yaml
# Example GitHub Actions
- name: Setup test environment
  run: |
    cd tests
    ./setup_test_env.sh

- name: Run tests
  run: |
    source tests/test_venv/bin/activate
    pytest tests/
```

## Troubleshooting

### Import errors
```bash
# Make sure the venv is activated
source test_venv/bin/activate

# Verify the Python path
python -c "import sys; print(sys.path)"
```

### Tests not discovered
```bash
# Check the pytest configuration
pytest --collect-only

# Verify test file naming (must start with test_)
ls test_*.py
```

### Clean rebuild
```bash
# Remove and recreate the test environment
rm -rf test_venv/
./setup_test_env.sh
```

## Resources

- **pytest Documentation**: https://docs.pytest.org/
- **Coverage.py**: https://coverage.readthedocs.io/
- **Module-specific READMEs**: Check each test module directory
Some files were not shown because too many files have changed in this diff