Compare commits

...

359 Commits

Author SHA1 Message Date
Travis Fischer 3f5d8e6595 🏎 2023-04-02 11:30:44 -05:00
Travis Fischer 9bb7da32e4
Merge pull request #504 from 189/fix/defer-save 2023-04-02 11:29:49 -05:00
Travis Fischer 81caf0ba11
Merge branch 'main' into fix/defer-save 2023-04-02 11:29:43 -05:00
Travis Fischer 0f966dc402 chore: minor fixes 2023-04-02 11:27:20 -05:00
Travis Fischer db6e5941ab
Merge pull request #507 from alxmiron/tokenize-response 2023-04-02 11:23:00 -05:00
Travis Fischer 599e197c09
Merge pull request #511 from zhengxs2018/main 2023-04-02 11:16:46 -05:00
Travis Fischer c0d79e3819
Merge branch 'main' into main 2023-04-02 11:16:26 -05:00
Travis Fischer a3b165f7e4
Merge pull request #509 from gfl94/patch-1 2023-04-02 11:15:32 -05:00
Travis Fischer c5abff264e
Merge pull request #510 from benjiJanssens/main 2023-04-02 11:15:18 -05:00
Travis Fischer bc94ed116a
Merge pull request #518 from Jarvan-via/jarvan 2023-04-02 11:15:05 -05:00
Travis Fischer 59b20771a9
Update readme.md 2023-04-02 11:14:57 -05:00
Travis Fischer 828510c1a0 5.1.4 2023-04-01 14:40:06 -05:00
Travis Fischer e1d536e62f feat: add cli support for model 2023-04-01 14:39:47 -05:00
袁勇 afa5aea5df feat: add OpenAI-API-Service for readme 2023-04-01 18:44:39 +08:00
阿森 de78787bc9 style: prettier code 2023-03-31 16:31:20 +08:00
阿森 79a9d50ec0
Update chatgpt-api.ts 2023-03-31 15:32:37 +08:00
阿森 52d3918255
feat: support multiple organizations 2023-03-31 15:29:07 +08:00
Benjamin Janssens 507a296249 Added SlackGPT to the README 2023-03-30 22:47:13 +02:00
Gefei Li 4da3ea6e1f
add a wechat bot with wechaty+chatgpt api 2023-03-30 19:53:50 +08:00
Oleksii Myronenko 7cd6e06bdd #477 Mark tokens as estimated 2023-03-29 23:48:50 +03:00
Oleksii Myronenko dd6b457932 #477 Estimate tokens usage with streaming completions 2023-03-29 23:34:06 +03:00
Travis Fischer 0984504154
Merge pull request #506 from T-Damer/patch-1 2023-03-29 14:49:42 -05:00
Daniil Pankov cf29a97c0b
Update how to use with commonJS
Add another workaround for commonJS libraries
2023-03-29 20:42:49 +03:00
wangshangwen 44a4a9121e fix: defer saving new message util got response from openAI 2023-03-28 12:00:12 +08:00
Travis Fischer 4613b95cc3 5.1.3 2023-03-27 22:13:51 -05:00
Travis Fischer 5f239e53bb chore: update ts docs 2023-03-27 22:13:37 -05:00
Travis Fischer 9eac18fac0 feat: update deps 2023-03-27 22:13:12 -05:00
Travis Fischer 0655b632e2
Merge pull request #500 from christophebe/main 2023-03-27 22:12:25 -05:00
Travis Fischer b4f59cb51d
Merge branch 'main' into main 2023-03-27 22:12:20 -05:00
Travis Fischer 912ca2432d
Merge pull request #498 from Elitezen/main 2023-03-27 22:11:32 -05:00
Travis Fischer c8b7351889
Update readme.md 2023-03-27 22:11:15 -05:00
Travis Fischer 020c21b173
Merge pull request #496 from zhengxs2018/patch-1 2023-03-27 22:10:08 -05:00
Travis Fischer 172c9ffd1b
Merge pull request #494 from formulahendry/patch-1 2023-03-27 22:08:59 -05:00
Christophe 052238fb62
Add new project : Julius GPT 2023-03-26 21:50:06 +02:00
Elitezen 34ac22408e Added discordjs-chatgpt under Projects 2023-03-25 17:12:20 -04:00
阿森 15058598fb
docs: add one WeChat Bot to readme
Thanks 🙏
2023-03-25 09:34:28 +08:00
Jun Han c869d1fc0b
docs: add one WeChat Bot to readme 2023-03-24 21:53:02 +08:00
Travis Fischer 9932fca55c
Merge pull request #493 from acheong08/patch-3 2023-03-24 01:29:28 -05:00
Antonio Cheong d61ab51d64
Update endpoint
My domain name expires in 5 days. Need to swap. Everything is exactly the same.
2023-03-24 14:19:10 +08:00
Travis Fischer 7d9d092dcb
Merge pull request #491 from PawanOsman/main 2023-03-23 10:02:07 -05:00
Pawan Osman ff40424d9b Update reverse proxy url and details 2023-03-23 10:18:00 +03:00
Travis Fischer 753c61c7b6
Merge pull request #485 from shixin-guo/main 2023-03-20 17:31:34 -05:00
士心 68cbd04aaa
doc: add a zoom chat based on chatgpt 2023-03-21 00:00:37 +08:00
Travis Fischer 523d076ba9
Merge pull request #479 from azoway/main 2023-03-18 08:57:29 -05:00
Travis Fischer c32d986c2b 5.1.2 2023-03-18 08:55:11 -05:00
Travis Fischer 66be552761 feat: update tokenizer 2023-03-18 08:54:48 -05:00
Travis Fischer fa8012c523 chore: update dev deps 2023-03-18 08:52:35 -05:00
azoway b574020a74
Update readme.md 2023-03-18 21:43:45 +08:00
Travis Fischer 48ad624904 fix: CI 2023-03-15 18:43:50 -05:00
Travis Fischer fcc84b61e4 feat: add gpt-4 demo 2023-03-15 18:16:29 -05:00
Travis Fischer 320ac10b2a 5.1.1 2023-03-15 18:00:58 -05:00
Travis Fischer 0e6f9d9591 5.1.0 2023-03-15 18:00:02 -05:00
Travis Fischer e21126e4e4 feat: add support for completionParams to sendMessage 2023-03-15 17:59:43 -05:00
Travis Fischer 5fef0f6ead
Merge pull request #464 from adi6409/main 2023-03-12 03:48:24 -05:00
Travis Fischer 2b4efd44e9 5.0.10 2023-03-11 05:08:27 -06:00
Travis Fischer fc3aad3d7d chore: update ts docs 2023-03-11 05:08:14 -06:00
Travis Fischer c4ffe539a4 feat: update ChatGPTUnofficialProxyAPI to new default proxy 2023-03-11 05:07:49 -06:00
Travis Fischer dab8d5fff4 feat: update deps 2023-03-11 05:05:48 -06:00
Travis Fischer 9ce9eadc10
Merge pull request #465 from acheong08/patch-2 2023-03-11 05:03:00 -06:00
Antonio Cheong d1d1585b04
new reverse proxy 2023-03-11 18:12:49 +08:00
Adi Stroianu e16296d471 moved /v1 to the default base url, to enable support for pawan's free api. 2023-03-10 23:57:17 +02:00
Travis Fischer 950100ab64 5.0.9 2023-03-09 12:41:27 -06:00
Travis Fischer 874248d959 chore: simplify some code 2023-03-09 12:41:02 -06:00
Travis Fischer 93a5094654
Merge pull request #457 from bytemain/main 2023-03-09 12:35:15 -06:00
Travis Fischer a9b0baeec6
Merge branch 'main' into main 2023-03-09 12:35:09 -06:00
Travis Fischer 8b2a7d36c4
Merge pull request #459 from yi-ge/main 2023-03-09 12:28:49 -06:00
Travis Fischer b1d21baa08
Merge pull request #460 from kidonng/patch-1 2023-03-09 12:27:13 -06:00
Travis Fischer 8691389aaa
Update readme.md 2023-03-09 12:27:03 -06:00
Kid c48fc9baf4
docs: fix incomplete text 2023-03-10 01:47:21 +08:00
yi-ge 8f6ee88f22 to: Allow the user string to be empty to get the remaining information 2023-03-10 00:29:57 +08:00
yi-ge b72727d174 fix: last detail lost 2023-03-09 23:42:00 +08:00
野声 d9cdf1563b fix: count prompt token error 2023-03-09 14:55:50 +08:00
Travis Fischer 5e256934a6
Merge pull request #447 from maxlibin/main 2023-03-08 00:43:01 -06:00
max c2b7a20706
Add ai itinerary project using chatgpt 2023-03-08 12:26:51 +08:00
Travis Fischer 5959fcedd3 5.0.8 2023-03-07 01:46:58 -06:00
Travis Fischer 34727e0edc fix: set maxModelTokens to 4000 to be conservative and avoid limits 2023-03-07 01:46:37 -06:00
Travis Fischer acfaa50875 5.0.7 2023-03-06 01:19:03 -06:00
Travis Fischer 87d802e8c8 2023-03-06 01:16:13 -06:00
Travis Fischer 785a61f742 fix: cli apiKey 2023-03-06 01:06:17 -06:00
Travis Fischer 0f1417f235
Merge pull request #433 from shulandmimi/main 2023-03-06 01:05:08 -06:00
shulandmimi 7921b52446 fix: can't use apiKey flag 2023-03-05 15:14:06 +08:00
Travis Fischer 0ed1ad0035 📮 2023-03-04 04:32:49 -06:00
Travis Fischer f27568b65b 5.0.6 2023-03-03 17:25:44 -06:00
Travis Fischer 15cfd10a59
Merge pull request #425 from hyln9/patch-1 2023-03-03 17:25:01 -06:00
Yule Hou dc8bcdee20
fix: use official system message by default
Fixing a typo in the default system message which sometimes causes
weird problems, including failure to retrieve certain information.
Additionally, using the official knowledge cutoff seems to help.
2023-03-04 00:04:39 +08:00
Travis Fischer e65699ed24 5.0.5 2023-03-02 17:25:40 -06:00
Travis Fischer c87f9c7c8e chore: update ts docs 2023-03-02 17:25:26 -06:00
Travis Fischer 48cb94473d chore: update ChatGPTAPI constructor init for typedoc 2023-03-02 17:24:49 -06:00
Travis Fischer ad3d1f9951
Merge pull request #422 from NoodleOfDeath/main 2023-03-02 17:22:25 -06:00
Travis Fischer a733b29861 feat: switch tokenizer to use cl100k_base encoding for gpt-3.5-turbo model 2023-03-02 17:13:21 -06:00
NoodleOfDeath 1a0b570dc9 feat: exports init options for external use 2023-03-02 13:19:45 -05:00
Travis Fischer e7f24b8c3d 5.0.4 2023-03-01 23:45:06 -06:00
Travis Fischer b294bc9792 🐭 2023-03-01 23:44:54 -06:00
Travis Fischer 997b5d2e84 5.0.3 2023-03-01 23:32:53 -06:00
Travis Fischer 5c49e20e3e chore: update docs 2023-03-01 23:32:33 -06:00
Travis Fischer 1e4ddd6b84 fix: openai types 2023-03-01 23:32:13 -06:00
Travis Fischer c0bf83a812 5.0.2 2023-03-01 23:25:23 -06:00
Travis Fischer 0a70f13ee4 feat: switch default model to non-date version 2023-03-01 23:25:08 -06:00
Travis Fischer 20966f765f 5.0.1 2023-03-01 21:50:41 -06:00
Travis Fischer 6e256e9721 fix: cli streaming issue 2023-03-01 21:50:22 -06:00
Travis Fischer 26a16fb5c4 🍚 2023-03-01 21:10:03 -06:00
Travis Fischer ffb31acb26
Merge pull request #407 from youngle316/add-nextjschatgpt 2023-03-01 21:07:51 -06:00
Travis Fischer 6f08ece929
Merge branch 'main' into add-nextjschatgpt 2023-03-01 21:07:42 -06:00
Travis Fischer 2842a553c4
Update readme.md 2023-03-01 21:07:19 -06:00
Travis Fischer 62f267cd5f
Merge pull request #406 from insulineru/add-aicommits 2023-03-01 21:06:44 -06:00
Travis Fischer bdda9a4415 👄 2023-03-01 20:58:42 -06:00
Travis Fischer 8d94d98506 5.0.0 2023-03-01 20:58:22 -06:00
Travis Fischer 5c6a0274b0 🐟 2023-03-01 20:58:06 -06:00
Travis Fischer 657720904d 🚈 2023-03-01 20:55:56 -06:00
Travis Fischer 1b326dbe8d 🔫 2023-03-01 20:54:58 -06:00
Travis Fischer 47b615d1d3
Merge pull request #412 from transitive-bullshit/feature/official-chat-completions-api 2023-03-01 20:54:18 -06:00
Travis Fischer 067a75854b 🦋 2023-03-01 20:54:10 -06:00
Travis Fischer fc429ec1ca 🐛 2023-03-01 20:51:30 -06:00
Travis Fischer 8ea449f26f fix: bugfix 2023-03-01 20:51:07 -06:00
Travis Fischer 271ed9d753 feat: add support for official OpenAI chat completions API 2023-03-01 20:49:20 -06:00
杨乐乐 0aa6f377bc docs: add NextJS chatgpt to README.md 2023-03-01 18:06:35 +08:00
Aleksey Lisun 5d62fe288b
chore: add ai-commit library to projects 2023-03-01 14:54:23 +07:00
Travis Fischer 8efe9a730c 4.8.3 2023-02-28 20:46:12 -06:00
Travis Fischer 8e26eb437f 🌑 2023-02-28 20:45:57 -06:00
Travis Fischer e3ee7272da fix: tokenizer special tokens 2023-02-28 20:44:48 -06:00
Travis Fischer 05eac22e9f 4.8.2 2023-02-28 16:58:35 -06:00
Travis Fischer a6d3c1ce02 😳 2023-02-28 16:57:51 -06:00
Travis Fischer 942f414df4
Merge pull request #403 from NoodleOfDeath/main 2023-02-28 16:54:31 -06:00
NoodleOfDeath 1bc24d249a fix: update tiktoken dep 2023-02-28 13:22:11 -05:00
Travis Fischer 37369f555f 4.8.1 2023-02-28 04:23:11 -06:00
Travis Fischer 539aa6d45c feat: check for invalid conversationId and parentMessageId 2023-02-28 04:21:30 -06:00
Travis Fischer 17b81456d2 4.8.0 2023-02-28 04:07:18 -06:00
Travis Fischer f2a66f83ce 2023-02-28 04:02:46 -06:00
Travis Fischer 16b29b6f17
Merge pull request #387 from gencay/patch-1 2023-02-28 04:00:17 -06:00
Travis Fischer 721291630b
Merge pull request #397 from dannysantino/whatsapp-bot 2023-02-28 03:56:17 -06:00
Travis Fischer 202b041f01
Merge pull request #391 from tehfonsi/main 2023-02-28 03:55:45 -06:00
Travis Fischer be8a89dce7
Update readme.md 2023-02-28 03:55:33 -06:00
Travis Fischer d706918958 🤛 2023-02-28 03:50:43 -06:00
Travis Fischer e6784d6ce6 🐝 2023-02-28 03:50:11 -06:00
Travis Fischer 920344f19f
Merge pull request #393 from AllanOricil/update-readme 2023-02-28 03:44:28 -06:00
Travis Fischer aaa482b5f0
Merge pull request #390 from transitive-bullshit/feature/rust-wasm-tokenizer 2023-02-28 03:43:11 -06:00
Travis Fischer a5d891ac16
Merge pull request #398 from acheong08/patch-1 2023-02-26 18:56:02 -06:00
Antonio Cheong 0409ecd6f1
Rate limit increased
Added more servers
2023-02-27 08:41:03 +08:00
Danny Santino 77484e03ac
docs: add new whatsapp chatbot to README.md 2023-02-26 09:16:30 +01:00
Travis Fischer edd364ef3f
Merge pull request #396 from IsiteYves/fixing-typos 2023-02-26 01:39:14 -06:00
Travis Fischer 5be8159daa
Update readme.md 2023-02-26 01:38:57 -06:00
Isite Yves 0d8dea1cbd
Fix typo in README.md 2023-02-25 15:35:20 +02:00
Allan Oricil 4f33703d42 add node js lib in the Access Token section 2023-02-24 17:06:02 -03:00
Stephan Schober 0eeab418a3 docs: add ai poem generator to projects 2023-02-24 12:55:16 +01:00
Ali Gençay a3e05c2b70
Write the underlying exception details to streamed response
Currently, when stream is enabled the `fetchSSE` isn't returning the api response details.
2023-02-23 04:14:08 -08:00
Travis Fischer a51ecdb2cd feat: switch to rust wasm port of tiktoken tokenizer 2023-02-23 00:30:40 -06:00
Travis Fischer 1fdf218b37 docs: tweak wording 2023-02-22 17:07:52 -06:00
Travis Fischer 6abb4319a9 🐲 2023-02-21 22:59:43 -06:00
Travis Fischer 862e9eeee6 4.7.2 2023-02-20 19:45:15 -06:00
Travis Fischer 8943cd35a6
Merge pull request #372 from fjc0k/main 2023-02-20 19:44:47 -06:00
Travis Fischer 96c2afe655
Merge pull request #373 from billylo1/main 2023-02-20 19:44:19 -06:00
Travis Fischer e346e85e54
Update readme.md 2023-02-20 19:43:56 -06:00
Billy Lo dd25790ed1
Merge pull request #1 from billylo1/billylo1-patch-1
add DomainGPT as an interesting use case for chatgpt-api
2023-02-20 12:57:09 -05:00
Billy Lo 52a5abc30a
add DomainGPT as an interesting use case for chatgpt-api 2023-02-20 12:55:37 -05:00
Jay Fong f0ce92559e fix(ChatGPTUnofficialProxyAPI): pass fetch 2023-02-20 16:51:20 +00:00
Travis Fischer 70e4797615 🏜 2023-02-19 17:43:19 -06:00
Travis Fischer efa1ed8e96 2023-02-19 17:37:28 -06:00
Travis Fischer 96c4762f0e 4.7.1 2023-02-19 17:00:37 -06:00
Travis Fischer e65a54f37b fix: cwd bug with packageJson 2023-02-19 17:00:21 -06:00
Travis Fischer 394badfba5 4.7.0 2023-02-19 06:29:44 -06:00
Travis Fischer 42d163b0b1 chore: update ts docs 2023-02-19 06:29:07 -06:00
Travis Fischer 607fccfc5a 🥒 2023-02-19 06:28:47 -06:00
Travis Fischer df0d6ef8e8 🔻 2023-02-19 06:26:17 -06:00
Travis Fischer 535d220e29 🛍 2023-02-19 06:25:32 -06:00
Travis Fischer 7c3a894115 🌿 2023-02-19 06:24:59 -06:00
Travis Fischer d9c307cc18 feat: improve CLI and add to readme 2023-02-19 06:17:26 -06:00
Travis Fischer 1224e0cb2e fix: fix cli build => ts 2023-02-19 04:39:49 -06:00
Travis Fischer 5f7609c0ad
Merge pull request #363 from zeke/add-cli 2023-02-19 04:34:31 -06:00
Travis Fischer b1f10a6c5d
Merge pull request #359 from linjungz/main 2023-02-19 04:26:01 -06:00
Travis Fischer ab933631f4 4.6.0 2023-02-19 03:48:35 -06:00
Travis Fischer d8eeb1a736 feat: switch from gpt-3-encoder to gpt3-tokenizer 2023-02-19 03:48:06 -06:00
Travis Fischer fc9869abf5 🚸 2023-02-19 03:25:11 -06:00
Travis Fischer 2460e5af93 🚬 2023-02-19 03:21:13 -06:00
Travis Fischer 218bdf81ab 💐 2023-02-19 03:18:05 -06:00
Travis Fischer bc3a6acaad 👤 2023-02-19 02:57:39 -06:00
Travis Fischer 7412564d65 4.5.1 2023-02-19 02:52:08 -06:00
Travis Fischer 31932298fd chore: update ts docs 2023-02-19 02:51:53 -06:00
Travis Fischer 6cf60eea77 🔈 2023-02-19 02:37:31 -06:00
Travis Fischer ef0285571d 4.5.0 2023-02-19 02:17:47 -06:00
Travis Fischer ef0e341557
Merge pull request #364 from transitive-bullshit/feature/unofficial-proxy-api 2023-02-19 02:17:03 -06:00
Travis Fischer 28ae98d79b 🐏 2023-02-19 02:14:46 -06:00
Travis Fischer 6110f232ea 🚍 2023-02-19 02:12:59 -06:00
Travis Fischer e1ad9eaaea 🤛 2023-02-19 02:10:58 -06:00
Travis Fischer 29ac58487a 🔑 2023-02-19 02:09:39 -06:00
Travis Fischer c5c1937388 😰 2023-02-19 02:09:19 -06:00
Travis Fischer a5272b721f 🛒 2023-02-19 02:06:33 -06:00
Travis Fischer b9255873f2 🗡 2023-02-19 02:05:05 -06:00
Travis Fischer ac5326516c 💞 2023-02-19 02:04:19 -06:00
Travis Fischer 26c48d1cf2 📘 2023-02-19 02:03:12 -06:00
Travis Fischer 6d81552509 feat: add ChatGPTUnofficialProxyAPI 2023-02-19 01:56:34 -06:00
Zeke Sikelianos 30f3abd471 add a CLI 2023-02-18 22:56:49 -08:00
Randy Lin 2b1fafae75
Update readme.md
Add a link for Feishu Bot
2023-02-19 00:05:56 +08:00
Travis Fischer f2f209657d
Merge pull request #354 from waynejohny/patch-1 2023-02-16 23:26:20 -06:00
waynejohny 60c43c1c42
Update readme.md
Typo demo-persistence.ts with demo-conversation.ts in the readme demo
2023-02-17 11:41:11 +08:00
Travis Fischer 72d35b8f1b 4.4.1 2023-02-15 17:01:21 -06:00
Travis Fischer 7f530fdc36
Merge pull request #346 from hvqzao/main 2023-02-15 17:00:33 -06:00
hvqzao 6f17b5946e Log to console only in debug mode 2023-02-15 18:42:42 +01:00
Travis Fischer a6105f5f6a chore: update ts docs 2023-02-14 00:31:49 -06:00
Travis Fischer 4379d52dd5 4.4.0 2023-02-14 00:30:23 -06:00
Travis Fischer 1bffd5e92a feat: add ability to override global "fetch" 2023-02-14 00:30:06 -06:00
Travis Fischer 0fc6575b80 4.3.3 2023-02-14 00:18:46 -06:00
Travis Fischer d621fb7c45 feat: remove reverse proxy URL from demo 2023-02-14 00:18:21 -06:00
Travis Fischer 8af7e8311d 📣 2023-02-13 01:43:47 -06:00
Travis Fischer cea8df65c1 4.3.2 2023-02-13 00:42:29 -06:00
Travis Fischer d0fba024aa 🎟 2023-02-13 00:42:11 -06:00
Travis Fischer 57ecbb3d4d
Merge pull request #328 from hujanais/main 2023-02-13 00:28:51 -06:00
Travis Fischer 9a9866e724 4.3.1 2023-02-13 00:21:02 -06:00
Travis Fischer 5b2402a200 4.3.0 2023-02-13 00:20:24 -06:00
Travis Fischer 70a91d47ef 🎴 2023-02-13 00:20:11 -06:00
Travis Fischer 1777c4551d feat: add support for using a reverse proxy to use the official ChatGPT models 2023-02-12 23:36:42 -06:00
Travis Fischer 34c886b5c4 🥁 2023-02-12 22:53:43 -06:00
Travis Fischer 0f0c12e275 feat: switch back to text-chat-davinci-002-20221122 2023-02-12 22:53:43 -06:00
Travis Fischer 09d607a530
Merge pull request #329 from youking-lib/main 2023-02-12 15:28:56 -06:00
Travis Fischer 27c5d65df1
Update readme.md 2023-02-12 15:28:48 -06:00
youking-lib 35e6dc25c3
Update readme.md 2023-02-12 22:05:33 +08:00
youking-lib 7d7b312857
Update readme.md 2023-02-12 22:04:21 +08:00
youking-lib 4b05f68ea3
Update readme.md 2023-02-12 22:01:06 +08:00
youking-lib b66f18eefb
doc: add ai-assistant intergration 2023-02-12 21:58:10 +08:00
Teik Lee 539011ee12 added WhatsApp Bot #5 (RaspberryPi + ngrok + Twilio) 2023-02-11 18:20:58 -05:00
Travis Fischer 835959f6b7 4.2.0 2023-02-07 16:04:22 -06:00
Travis Fischer c682cd3fbc feat: switch to text-davinci-003 2023-02-07 16:02:25 -06:00
Travis Fischer 6e9dea04fa 4.1.3 2023-02-07 03:47:30 -06:00
Travis Fischer 78bd4fff43 fix: add catch for fetchSSE promise rejection 2023-02-07 03:47:09 -06:00
Travis Fischer 5b8bb66a18 4.1.2 2023-02-06 19:15:51 -06:00
Travis Fischer ed59af39c2 chore: update ts docs 2023-02-06 19:15:18 -06:00
Travis Fischer aefae230e0
Merge pull request #307 from mvtavares/add_isomorphic_fetch_readme 2023-02-06 19:14:56 -06:00
Travis Fischer 88a7e2298c
Merge pull request #306 from pacholoamit/readme 2023-02-06 19:14:20 -06:00
Travis Fischer e8748fe3bf
Update readme.md 2023-02-06 19:13:54 -06:00
Travis Fischer 34abda1e1d
Merge pull request #305 from MarkusGalant/patch-1 2023-02-06 19:11:27 -06:00
Travis Fischer 7c8bc53242
Merge pull request #299 from Cadienvan/patch-1 2023-02-06 19:10:50 -06:00
marvin 1aaf4d37c5 chore: including isomorphic-fetch 2023-02-06 11:02:21 -03:00
pacholoamit 727b46a019 Add project to readme 2023-02-06 20:00:46 +08:00
MarkusGalant 6167365189
Add Slack Bot #4 project to readme.md 2023-02-06 00:29:54 +02:00
Travis Fischer ba0970d20a
Merge pull request #301 from sebas00/sebas00-patch-1 2023-02-04 18:21:16 -06:00
Travis Fischer 2e9f3c228a
Update readme.md 2023-02-04 18:21:06 -06:00
Sebastiaan de Man 8287ffa215
Add clippyJS bot to readme 2023-02-04 19:54:29 +01:00
Michael Di Prisco 47e26cc7a2
chore: fixing param stop being array instead of string 2023-02-04 14:54:23 +01:00
Michael Di Prisco 329e076189
chore: moving from im_end to completion params
As we allow the constructor to define completionParams, it would be better to allow devs using the library to use the specified stop token.
As of right now, the library allows such stop to be passed but it isn't considered in the class itself.
2023-02-04 09:44:51 +01:00
Travis Fischer d8072bb3e1
Merge pull request #293 from NessunKim/patch-1 2023-02-03 16:36:41 -06:00
Travis Fischer 15a1ae0b13 4.1.1 2023-02-02 23:34:17 -06:00
Travis Fischer bc79490bf2 feat: update chatgpt model 2023-02-02 23:33:57 -06:00
Myeonghyeon Kim 41eeeb5c26
docs: add NessunKim/slack-chatgpt to README
Hi. I built a simple Slack bot to use this lib!
2023-02-03 12:33:55 +09:00
Travis Fischer 2cb48c9601 4.1.0 2023-02-02 14:12:25 -06:00
Travis Fischer 1c5d0a0e53 feat: replace chat model with text-davinci-003 as temporary workaround 2023-02-02 14:11:39 -06:00
Travis Fischer 58b9ec040d
Merge pull request #291 from gmpetrov/main 2023-02-02 14:08:53 -06:00
Travis Fischer 467785c0fc 4.0.5 2023-02-02 13:59:41 -06:00
Travis Fischer dc4997d511 fix: completionParams 2023-02-02 13:59:08 -06:00
Travis Fischer 82e89b3237 fix: completionParams should be partial optional 2023-02-02 13:58:02 -06:00
Georges Petrov 38565d7f9a
📄 Added openai-chatgpt to readme 2023-02-02 14:48:52 +01:00
Travis Fischer 9ec984da76 4.0.4 2023-02-02 01:49:31 -06:00
Travis Fischer e24ac37aaa 🐂 2023-02-01 23:28:39 -06:00
Travis Fischer c1859b0126 📀 2023-02-01 18:22:16 -06:00
Travis Fischer 47721aac7a 🐽 2023-02-01 18:20:48 -06:00
Travis Fischer b4315cab6e 🚆 2023-02-01 18:20:28 -06:00
Travis Fischer efb1de652b 2023-02-01 18:18:12 -06:00
Travis Fischer d7c757af3b 🕺 2023-02-01 18:17:54 -06:00
Travis Fischer e492f0db95 💉 2023-02-01 18:17:46 -06:00
Travis Fischer f248ad2490 chore: update docs 2023-02-01 18:15:56 -06:00
Travis Fischer 2dd0ca9196 4.0.3 2023-02-01 18:12:59 -06:00
Travis Fischer 6ca86037c7 docs: update readme with better explanations 2023-02-01 18:12:29 -06:00
Travis Fischer 9d49e78415 feat: add more config params and docs 2023-02-01 18:01:34 -06:00
Travis Fischer 3229113fd4 4.0.2 2023-02-01 12:35:18 -06:00
Travis Fischer b20fa74793 fix: conversationId undefined 2023-02-01 12:35:04 -06:00
Travis Fischer 1019db09b8 💴 2023-02-01 07:10:28 -06:00
Travis Fischer ecef4cc154 4.0.1 2023-02-01 07:09:16 -06:00
Travis Fischer 57b18402d0 🦋 2023-02-01 07:09:04 -06:00
Travis Fischer 1f6f08ee69 🐞 2023-02-01 05:52:13 -06:00
Travis Fischer b823c74cdf docs: update readme 2023-02-01 05:10:43 -06:00
Travis Fischer ac0622da13 4.0.0 2023-02-01 05:04:27 -06:00
Travis Fischer af1f57fc2c
Merge pull request #284 from transitive-bullshit/feature/hidden-model-api 2023-02-01 05:03:13 -06:00
Travis Fischer 67b872b76e 🐷 2023-02-01 05:01:14 -06:00
Travis Fischer cd249f9eb2 💍 2023-02-01 04:59:01 -06:00
Travis Fischer 3a0259c128 docs: update readme and auto-gen docs 2023-02-01 04:58:25 -06:00
Travis Fischer 531e180e3f feat: add persistence demo and keyv support 2023-02-01 04:48:36 -06:00
Travis Fischer 21dd9d518f feat: MAJOR BREAKING CHANGE; moved from browser to official completion API with unofficial chatgpt model 2023-02-01 03:14:10 -06:00
Travis Fischer babaab3c72 3.5.1 2023-01-22 01:53:22 -06:00
Travis Fischer 9ee7746d49 chore: update ts docs 2023-01-22 01:53:09 -06:00
Travis Fischer 83bb9cf813 fix; isProAccount should be optional 2023-01-22 01:52:43 -06:00
Travis Fischer 0e28a2b939
Merge pull request #276 from PawanOsman/main 2023-01-22 01:51:27 -06:00
Pawan Osman ba71198ab3 Support for Pro Accounts 2023-01-22 07:43:41 +00:00
Travis Fischer 54a08a562b
Merge pull request #274 from TheBrokenRail/patch-1 2023-01-22 01:05:14 -06:00
Travis Fischer 97cd50afaf
Update readme.md 2023-01-22 01:05:08 -06:00
TheBrokenRail 25997bce33
Add `ShakespeareBot` to the project list
I made a small Discord bot that converts everyone's messages with a certain role to a message in the style of Shakespeare.
2023-01-21 22:16:36 -05:00
Travis Fischer fa94838019 3.5.0 2023-01-20 18:44:30 -06:00
Travis Fischer b016c001eb chore: update readme 2023-01-20 18:44:01 -06:00
Travis Fischer 791b2f4d70
Merge pull request #273 from itskdhere/main 2023-01-20 18:42:59 -06:00
Travis Fischer 25ca608df1
Merge pull request #271 from kodjunkie/patch-2 2023-01-20 18:41:46 -06:00
Krishnendu Das e28e638eb5
Added `itskdhere/ChatGPT-Discord-BOT` in list 2023-01-20 02:35:03 +05:30
Lawrence Onah 55b56693ba chore: update pnpm-lock.yaml 2023-01-19 14:38:56 +01:00
Lawrence Onah e21a690e9e chore: add support for userDataDir 2023-01-19 14:31:08 +01:00
Travis Fischer b5698e3ae2 chore: remove unused code 2023-01-18 00:21:05 -06:00
Travis Fischer c3bf0c31a0
Merge pull request #257 from Kladdy/patch-1 2023-01-15 05:17:38 -06:00
Travis Fischer 410b247e23 3.4.2 2023-01-15 03:53:55 -06:00
Travis Fischer 0c11a4d766 feat: improve support for proxy auth 2023-01-15 03:53:31 -06:00
Travis Fischer 1d784766c5
Merge pull request #259 from thanhsonng/main 2023-01-14 17:47:04 -06:00
Sơn Nguyễn 9a434777da feat: Add new project to readme 2023-01-14 23:03:38 +00:00
Sigfrid Stjärnholm 6d63943d07
Add project to project list 2023-01-12 15:08:37 +01:00
Travis Fischer 824e73f128 🍘 2023-01-12 04:07:37 -06:00
Travis Fischer 4c196ba863 🏺 2023-01-12 04:06:33 -06:00
Travis Fischer 09e3a322d6 🤧 2023-01-12 04:06:05 -06:00
Travis Fischer 98d4121c81 3.4.1 2023-01-12 04:05:04 -06:00
Travis Fischer 71c95073e3 chore: update ts docs 2023-01-12 04:04:51 -06:00
Travis Fischer 4a0f780db6 docs: update readme 2023-01-12 04:04:31 -06:00
Travis Fischer 0636c37ddc 2023-01-12 03:57:22 -06:00
Travis Fischer efbd1146f5 3.4.0 2023-01-12 03:55:36 -06:00
Travis Fischer e0fd5f4652 feat: add onProgress to ChatGPTAPIBrowser.sendMessage 2023-01-12 03:55:20 -06:00
Travis Fischer 525524b848 3.3.13 2023-01-11 03:42:36 -06:00
Travis Fischer c64d3055cc feat: add bypass for 2023-01-09 update modal 2023-01-11 03:42:13 -06:00
Travis Fischer a70d6f5452 feat: add google auth demo 2023-01-11 03:41:50 -06:00
Travis Fischer 980c8fd0aa feat: add waylaidwanderer/node-chatgpt-api to readme 2023-01-10 17:15:39 -06:00
Travis Fischer 30e63cf552
Merge pull request #242 from noelzappy/main 2023-01-10 04:31:26 -06:00
Travis Fischer 3bb14e06b9
Update readme.md 2023-01-10 04:31:08 -06:00
Travis Fischer 026c6865db
Merge pull request #236 from mitkodkn/patch-1 2023-01-10 04:29:49 -06:00
Travis Fischer ba965a043b 3.3.12 2023-01-10 03:08:49 -06:00
Travis Fischer 1923f775e1
Merge pull request #249 from czzonet/main 2023-01-10 03:08:17 -06:00
czzonet 384db42dcb fix JSON parse error 2023-01-10 15:54:54 +08:00
Emmanuel Yeboah 6a78462937
Update readme.md 2023-01-09 10:29:12 +00:00
Emmanuel Yeboah 34134f193f
Update readme.md 2023-01-09 10:25:56 +00:00
Dimitar K. Nikolov 3378746bcf
chore: add NestJS starter boilerplate to readme 2023-01-07 11:37:19 +02:00
Travis Fischer d4067ca455 3.3.11 2023-01-06 16:05:38 -06:00
Travis Fischer b6649b9484 fix: add slight delay for password => login click 2023-01-06 16:05:07 -06:00
Travis Fischer 158cc7fb56
Merge pull request #232 from xtremehpx/main 2023-01-05 21:18:59 -06:00
Travis Fischer 99f5c69e17
Update readme.md 2023-01-05 21:18:30 -06:00
xtremehpx 6d95f77cf3 Merge remote-tracking branch 'upstream/main' 2023-01-05 13:33:05 -08:00
xtremehpx 336a91d31d
Update readme.md
Add Wordsmith project
2023-01-05 13:18:17 -08:00
Travis Fischer 8c7ab5e90f
Merge pull request #226 from jcromero/patch-1 2023-01-03 02:30:18 -06:00
jcromero 562aafce95
Update readme.me
Updated URL for matrix bot for ChatGPT
2023-01-03 09:15:59 +01:00
Travis Fischer 8ac22d0537 3.3.10 2023-01-02 17:35:58 -06:00
Travis Fischer fe9bcd3bb6 chore: update ts docs 2023-01-02 17:28:55 -06:00
Travis Fischer 025e7e3a24 docs: update demo to remove redundant browser conversation demo 2023-01-02 17:28:35 -06:00
Travis Fischer 726550b55b
Merge pull request #225 from jwgatt/main 2023-01-02 17:25:28 -06:00
Travis Fischer 9ce1b0cf4f chore: update year to 2023 2023-01-02 16:48:37 -06:00
jwgatt 19966aae2d update authclass 2023-01-02 22:27:08 +01:00
Travis Fischer 5ff3191766
Merge pull request #223 from ArdaGnsrn/main 2023-01-01 19:26:21 -06:00
Travis Fischer f2c6c6ba67
Merge pull request #215 from RusDyn/patch-1 2023-01-01 19:25:43 -06:00
Travis Fischer 46be9c3aa1
Update readme.md 2023-01-01 19:25:27 -06:00
Arda Günsüren 99083bace8
Update readme.md 2023-01-02 02:18:18 +02:00
Travis Fischer 2ac4fe6c07
Merge pull request #220 from f/patch-1 2022-12-31 23:05:14 -06:00
Fatih Kadir Akın f6c08f6a8a
optionally, easier way to get userAgent 2023-01-01 03:24:07 +03:00
Travis Fischer 7b8c7fb05d 3.3.9 2022-12-31 15:22:12 -06:00
Travis Fischer c1ec537903 fix: minor PR cleanups 2022-12-31 15:21:55 -06:00
Travis Fischer 5c7b9d64a4
Merge pull request #219 from fuergaosi233/feat/impove-robustness
Fix undefined
2022-12-31 15:18:33 -06:00
Holegots e98f740f1a ️ Improve robustness 2023-01-01 01:50:37 +08:00
RusDyn b112d57f40
Update readme.md 2022-12-30 12:12:15 +00:00
Travis Fischer c29a5f3fdd 3.3.8 2022-12-29 01:18:24 -06:00
Travis Fischer 9e2779ef5a 📛 2022-12-29 01:15:25 -06:00
Travis Fischer 6187ded9e1 🔖 2022-12-29 01:14:41 -06:00
Travis Fischer 94e7db388a
Merge pull request #210 from codextde/main 2022-12-29 01:14:21 -06:00
Travis Fischer f9794629b5 3.3.7 2022-12-28 16:33:42 -06:00
Travis Fischer 10db1601a0 feat: minor improvements to captcha bypass 2022-12-28 16:33:24 -06:00
Daniel Ehrhardt 9f62f52c5a
Added YouTube Video Summary Telegram Bot 2022-12-28 23:22:56 +01:00
Travis Fischer d0506dad6b
Merge pull request #208 from lokwkin/adding-slackbot-integration 2022-12-28 15:07:23 -06:00
lokwkin 50b2500e7f adding slackbot integration 2022-12-29 00:29:05 +08:00
Travis Fischer 77d1d8e089 3.3.6 2022-12-28 02:23:50 -06:00
Travis Fischer 533a65db1c feat: puppeteer robustness improvements 2022-12-28 01:58:29 -06:00
Travis Fischer d7615577bf 3.3.5 2022-12-27 12:04:45 -06:00
Travis Fischer a4f9407ae0
Merge pull request #205 from suhaotian/hotfix/fix-no-element-found 2022-12-27 12:03:57 -06:00
suhaotian 345b5be4ec fix: No element found for selector: button[data-provider="google"] 2022-12-27 19:26:32 +08:00
Travis Fischer 63e7c47ab0 3.3.4 2022-12-27 00:52:50 -06:00
80 changed files with 14313 additions and 8871 deletions

View file

@@ -6,8 +6,7 @@
# ------------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# ChatGPT
# OpenAI
# -----------------------------------------------------------------------------
OPENAI_EMAIL=
OPENAI_PASSWORD=
OPENAI_API_KEY=

View file

@@ -23,7 +23,7 @@ body:
- type: textarea
attributes:
label: Environment details
description: Please enter Node.js version, browser version, OS, and OS version.
description: Please enter Node.js version, browser version, and OS version.
validations:
required: true
- type: textarea

146
bin/cli.js 100755
View file

@@ -0,0 +1,146 @@
#!/usr/bin/env node
import crypto from 'node:crypto'
import * as url from 'url'
import { cac } from 'cac'
import Conf from 'conf'
import { readPackageUp } from 'read-pkg-up'
import { ChatGPTAPI } from '../build/index.js'
async function main() {
const dirname = url.fileURLToPath(new URL('.', import.meta.url))
const pkg = await readPackageUp({ cwd: dirname })
const version = (pkg && pkg.packageJson && pkg.packageJson.version) || '4'
const config = new Conf({ projectName: 'chatgpt' })
const cli = cac('chatgpt')
cli
.command('<prompt>', 'Ask ChatGPT a question')
.option('-c, --continue', 'Continue last conversation', {
default: false
})
.option('-d, --debug', 'Enables debug logging', {
default: false
})
.option('-s, --stream', 'Streams the response', {
default: true
})
.option('-s, --store', 'Enables the local message cache', {
default: true
})
.option('-t, --timeout <timeout>', 'Timeout in milliseconds')
.option('-k, --apiKey <apiKey>', 'OpenAI API key')
.option('-o, --apiOrg <apiOrg>', 'OpenAI API key')
.option('-m, --model <model>', 'Model (gpt-3.5-turbo, gpt-4)', {
default: 'gpt-3.5-turbo'
})
.option(
'-n, --conversationName <conversationName>',
'Unique name for the conversation'
)
.action(async (prompt, options) => {
const apiOrg = options.apiOrg || process.env.OPENAI_API_ORG
const apiKey = options.apiKey || process.env.OPENAI_API_KEY
if (!apiKey) {
console.error('error: either set OPENAI_API_KEY or use --apiKey\n')
cli.outputHelp()
process.exit(1)
}
const apiKeyHash = hash(apiKey)
const conversationName = options.conversationName || 'default'
const conversationKey = `${conversationName}:${apiKeyHash}`
const conversation =
options.continue && options.store
? config.get(conversationKey, {}) || {}
: {}
const model = options.model
let conversationId = undefined
let parentMessageId = undefined
if (conversation.lastMessageId) {
const lastMessage = conversation[conversation.lastMessageId]
if (lastMessage) {
conversationId = lastMessage.conversationId
parentMessageId = lastMessage.id
}
}
if (options.debug) {
console.log('using config', config.path)
}
const api = new ChatGPTAPI({
apiKey,
apiOrg,
debug: options.debug,
completionParams: {
model
},
getMessageById: async (id) => {
if (options.store) {
return conversation[id]
} else {
return null
}
},
upsertMessage: async (message) => {
if (options.store) {
conversation[message.id] = message
conversation.lastMessageId = message.id
config.set(conversationKey, conversation)
}
}
})
const res = await api.sendMessage(prompt, {
conversationId,
parentMessageId,
timeoutMs: options.timeout || undefined,
onProgress: options.stream
? (progress) => {
if (progress.delta) {
process.stdout.write(progress.delta)
}
}
: undefined
})
if (options.stream) {
process.stdout.write('\n')
} else {
console.log(res.text)
}
})
cli.command('rm-cache', 'Clears the local message cache').action(() => {
config.clear()
console.log('cleared cache', config.path)
})
cli.command('ls-cache', 'Prints the local message cache path').action(() => {
console.log(config.path)
})
cli.help()
cli.version(version)
try {
cli.parse()
} catch (err) {
console.error(`error: ${err.message}\n`)
cli.outputHelp()
process.exit(1)
}
}
function hash(d) {
const buffer = Buffer.isBuffer(d) ? d : Buffer.from(d.toString())
return crypto.createHash('sha256').update(buffer).digest('hex')
}
main().catch((err) => {
console.error(err)
process.exit(1)
})

View file

@@ -1,81 +0,0 @@
import dotenv from 'dotenv-safe'
import { oraPromise } from 'ora'
import { ChatGPTAPIBrowser } from '../src'
dotenv.config()
/**
* Demo CLI for testing conversation support.
*
* ```
* npx tsx demos/demo-conversation-browser.ts
* ```
*/
async function main() {
const email = process.env.OPENAI_EMAIL
const password = process.env.OPENAI_PASSWORD
const api = new ChatGPTAPIBrowser({
email,
password,
debug: false,
minimize: true
})
await api.initSession()
const prompt = 'Write a poem about cats.'
let res = await oraPromise(api.sendMessage(prompt), {
text: prompt
})
console.log('\n' + res.response + '\n')
const prompt2 = 'Can you make it cuter and shorter?'
res = await oraPromise(
api.sendMessage(prompt2, {
conversationId: res.conversationId,
parentMessageId: res.messageId
}),
{
text: prompt2
}
)
console.log('\n' + res.response + '\n')
const prompt3 = 'Now write it in French.'
res = await oraPromise(
api.sendMessage(prompt3, {
conversationId: res.conversationId,
parentMessageId: res.messageId
}),
{
text: prompt3
}
)
console.log('\n' + res.response + '\n')
const prompt4 = 'What were we talking about again?'
res = await oraPromise(
api.sendMessage(prompt4, {
conversationId: res.conversationId,
parentMessageId: res.messageId
}),
{
text: prompt4
}
)
console.log('\n' + res.response + '\n')
// close the browser at the end
await api.closeSession()
}
main().catch((err) => {
console.error(err)
process.exit(1)
})

View file

@@ -1,7 +1,7 @@
import dotenv from 'dotenv-safe'
import { oraPromise } from 'ora'
import { ChatGPTAPI, getOpenAIAuth } from '../src'
import { ChatGPTAPI } from '../src'
dotenv.config()
@@ -13,65 +13,54 @@ dotenv.config()
* ```
*/
async function main() {
const email = process.env.OPENAI_EMAIL
const password = process.env.OPENAI_PASSWORD
const authInfo = await getOpenAIAuth({
email,
password
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY,
debug: false
})
const api = new ChatGPTAPI({ ...authInfo })
await api.initSession()
const prompt = 'Write a poem about cats.'
let res = await oraPromise(api.sendMessage(prompt), {
text: prompt
})
console.log('\n' + res.response + '\n')
console.log('\n' + res.text + '\n')
const prompt2 = 'Can you make it cuter and shorter?'
res = await oraPromise(
api.sendMessage(prompt2, {
conversationId: res.conversationId,
parentMessageId: res.messageId
parentMessageId: res.id
}),
{
text: prompt2
}
)
console.log('\n' + res.response + '\n')
console.log('\n' + res.text + '\n')
const prompt3 = 'Now write it in French.'
res = await oraPromise(
api.sendMessage(prompt3, {
conversationId: res.conversationId,
parentMessageId: res.messageId
parentMessageId: res.id
}),
{
text: prompt3
}
)
console.log('\n' + res.response + '\n')
console.log('\n' + res.text + '\n')
const prompt4 = 'What were we talking about again?'
res = await oraPromise(
api.sendMessage(prompt4, {
conversationId: res.conversationId,
parentMessageId: res.messageId
parentMessageId: res.id
}),
{
text: prompt4
}
)
console.log('\n' + res.response + '\n')
await api.closeSession()
console.log('\n' + res.text + '\n')
}
main().catch((err) => {

View file

@@ -0,0 +1,35 @@
import dotenv from 'dotenv-safe'
import { oraPromise } from 'ora'
import { ChatGPTAPI } from '../src'
dotenv.config()
/**
* Demo CLI for testing the GPT-4 model.
*
* ```
* npx tsx demos/demo-gpt-4.ts
* ```
*/
async function main() {
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY,
debug: true,
completionParams: {
model: 'gpt-4'
}
})
const prompt = 'When should you use Python vs TypeScript?'
const res = await oraPromise(api.sendMessage(prompt), {
text: prompt
})
console.log(res.text)
}
main().catch((err) => {
console.error(err)
process.exit(1)
})

View file

@@ -0,0 +1,32 @@
import dotenv from 'dotenv-safe'
import { ChatGPTAPI } from '../src'
dotenv.config()
/**
* Demo CLI for testing the `onProgress` streaming support.
*
* ```
* npx tsx demos/demo-on-progress.ts
* ```
*/
async function main() {
const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })
const prompt =
'Write a python version of bubble sort. Do not include example usage.'
console.log(prompt)
const res = await api.sendMessage(prompt, {
onProgress: (partialResponse) => {
console.log(partialResponse.text)
}
})
console.log(res.text)
}
main().catch((err) => {
console.error(err)
process.exit(1)
})

View file

@@ -0,0 +1,71 @@
import KeyvRedis from '@keyv/redis'
import dotenv from 'dotenv-safe'
import Keyv from 'keyv'
import { oraPromise } from 'ora'
import { ChatGPTAPI, type ChatMessage } from '../src'
dotenv.config()
/**
* Demo CLI for testing message persistence with redis.
*
* ```
* npx tsx demos/demo-persistence.ts
* ```
*/
async function main() {
const redisUrl = process.env.REDIS_URL || 'redis://localhost:6379'
const store = new KeyvRedis(redisUrl)
const messageStore = new Keyv({ store, namespace: 'chatgpt-demo' })
let res: ChatMessage
{
// create an initial conversation in one client
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY,
messageStore
})
const prompt = 'What are the top 5 anime of all time?'
res = await oraPromise(api.sendMessage(prompt), {
text: prompt
})
console.log('\n' + res.text + '\n')
}
{
// follow up with a second client using the same underlying redis store
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY,
messageStore
})
const prompt = 'Can you give 5 more?'
res = await oraPromise(
api.sendMessage(prompt, {
parentMessageId: res.id
}),
{
text: prompt
}
)
console.log('\n' + res.text + '\n')
}
// wait for redis to finish and then disconnect
await new Promise<void>((resolve) => {
setTimeout(() => {
messageStore.disconnect()
resolve()
}, 1000)
})
}
main().catch((err) => {
console.error(err)
process.exit(1)
})

View file

@@ -0,0 +1,79 @@
import dotenv from 'dotenv-safe'
import { oraPromise } from 'ora'
import { ChatGPTUnofficialProxyAPI } from '../src'
dotenv.config()
/**
* Demo for testing conversation support using a reverse proxy which provides
* access to the unofficial ChatGPT API.
*
* ```
* npx tsx demos/demo-reverse-proxy.ts
* ```
*/
async function main() {
// WARNING: this method will expose your access token to a third-party. Please be
// aware of the risks before using this method.
const api = new ChatGPTUnofficialProxyAPI({
// optionally override the default reverse proxy URL (or use one of your own...)
// apiReverseProxyUrl: 'https://chat.duti.tech/api/conversation',
// apiReverseProxyUrl: 'https://gpt.pawan.krd/backend-api/conversation',
accessToken: process.env.OPENAI_ACCESS_TOKEN,
debug: false
})
const prompt = 'Write a poem about cats.'
let res = await oraPromise(api.sendMessage(prompt), {
text: prompt
})
console.log('\n' + res.text + '\n')
const prompt2 = 'Can you make it cuter and shorter?'
res = await oraPromise(
api.sendMessage(prompt2, {
conversationId: res.conversationId,
parentMessageId: res.id
}),
{
text: prompt2
}
)
console.log('\n' + res.text + '\n')
const prompt3 = 'Now write it in French.'
res = await oraPromise(
api.sendMessage(prompt3, {
conversationId: res.conversationId,
parentMessageId: res.id
}),
{
text: prompt3
}
)
console.log('\n' + res.text + '\n')
const prompt4 = 'What were we talking about again?'
res = await oraPromise(
api.sendMessage(prompt4, {
conversationId: res.conversationId,
parentMessageId: res.id
}),
{
text: prompt4
}
)
console.log('\n' + res.text + '\n')
}
main().catch((err) => {
console.error(err)
process.exit(1)
})

View file

@@ -1,7 +1,7 @@
import dotenv from 'dotenv-safe'
import { oraPromise } from 'ora'
import { ChatGPTAPIBrowser } from '../src'
import { ChatGPTAPI } from '../src'
dotenv.config()
@@ -13,16 +13,10 @@ dotenv.config()
* ```
*/
async function main() {
const email = process.env.OPENAI_EMAIL
const password = process.env.OPENAI_PASSWORD
const api = new ChatGPTAPIBrowser({
email,
password,
debug: false,
minimize: true
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY,
debug: false
})
await api.initSession()
const prompt =
'Write a python version of bubble sort. Do not include example usage.'
@@ -30,10 +24,7 @@ async function main() {
const res = await oraPromise(api.sendMessage(prompt), {
text: prompt
})
console.log(res.response)
// close the browser at the end
await api.closeSession()
console.log(res.text)
}
main().catch((err) => {

View file

@@ -1,167 +0,0 @@
[chatgpt](../readme.md) / [Exports](../modules.md) / AChatGPTAPI
# Class: AChatGPTAPI
## Hierarchy
- **`AChatGPTAPI`**
↳ [`ChatGPTAPI`](ChatGPTAPI.md)
↳ [`ChatGPTAPIBrowser`](ChatGPTAPIBrowser.md)
## Table of contents
### Constructors
- [constructor](AChatGPTAPI.md#constructor)
### Methods
- [closeSession](AChatGPTAPI.md#closesession)
- [getIsAuthenticated](AChatGPTAPI.md#getisauthenticated)
- [initSession](AChatGPTAPI.md#initsession)
- [refreshSession](AChatGPTAPI.md#refreshsession)
- [resetSession](AChatGPTAPI.md#resetsession)
- [sendMessage](AChatGPTAPI.md#sendmessage)
## Constructors
### constructor
**new AChatGPTAPI**()
## Methods
### closeSession
`Abstract` **closeSession**(): `Promise`<`void`\>
Closes the active session.
**`Throws`**
An error if it fails.
#### Returns
`Promise`<`void`\>
#### Defined in
[src/abstract-chatgpt-api.ts:69](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/abstract-chatgpt-api.ts#L69)
___
### getIsAuthenticated
`Abstract` **getIsAuthenticated**(): `Promise`<`boolean`\>
#### Returns
`Promise`<`boolean`\>
`true` if the client is authenticated with a valid session or `false`
otherwise.
#### Defined in
[src/abstract-chatgpt-api.ts:39](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/abstract-chatgpt-api.ts#L39)
___
### initSession
`Abstract` **initSession**(): `Promise`<`void`\>
Performs any async initialization work required to ensure that this API is
properly authenticated.
**`Throws`**
An error if the session failed to initialize properly.
#### Returns
`Promise`<`void`\>
#### Defined in
[src/abstract-chatgpt-api.ts:10](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/abstract-chatgpt-api.ts#L10)
___
### refreshSession
`Abstract` **refreshSession**(): `Promise`<`any`\>
Refreshes the current ChatGPT session.
Useful for bypassing 403 errors when Cloudflare clearance tokens expire.
**`Throws`**
An error if it fails.
#### Returns
`Promise`<`any`\>
Access credentials for the new session.
#### Defined in
[src/abstract-chatgpt-api.ts:49](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/abstract-chatgpt-api.ts#L49)
___
### resetSession
**resetSession**(): `Promise`<`any`\>
Closes the current ChatGPT session and starts a new one.
Useful for bypassing 401 errors when sessions expire.
**`Throws`**
An error if it fails.
#### Returns
`Promise`<`any`\>
Access credentials for the new session.
#### Defined in
[src/abstract-chatgpt-api.ts:59](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/abstract-chatgpt-api.ts#L59)
___
### sendMessage
`Abstract` **sendMessage**(`message`, `opts?`): `Promise`<[`ChatResponse`](../modules.md#chatresponse)\>
Sends a message to ChatGPT, waits for the response to resolve, and returns
the response.
If you want to receive a stream of partial responses, use `opts.onProgress`.
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `message` | `string` | The prompt message to send |
| `opts?` | [`SendMessageOptions`](../modules.md#sendmessageoptions) | - |
#### Returns
`Promise`<[`ChatResponse`](../modules.md#chatresponse)\>
The response from ChatGPT, including `conversationId`, `messageId`, and
the `response` text.
#### Defined in
[src/abstract-chatgpt-api.ts:30](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/abstract-chatgpt-api.ts#L30)

View file

@@ -2,12 +2,6 @@
# Class: ChatGPTAPI
## Hierarchy
- [`AChatGPTAPI`](AChatGPTAPI.md)
**`ChatGPTAPI`**
## Table of contents
### Constructors
@@ -16,20 +10,11 @@
### Accessors
- [clearanceToken](ChatGPTAPI.md#clearancetoken)
- [sessionToken](ChatGPTAPI.md#sessiontoken)
- [user](ChatGPTAPI.md#user)
- [userAgent](ChatGPTAPI.md#useragent)
- [apiKey](ChatGPTAPI.md#apikey)
### Methods
- [closeSession](ChatGPTAPI.md#closesession)
- [getIsAuthenticated](ChatGPTAPI.md#getisauthenticated)
- [initSession](ChatGPTAPI.md#initsession)
- [refreshSession](ChatGPTAPI.md#refreshsession)
- [resetSession](ChatGPTAPI.md#resetsession)
- [sendMessage](ChatGPTAPI.md#sendmessage)
- [sendModeration](ChatGPTAPI.md#sendmoderation)
## Constructors
@@ -37,274 +22,76 @@
**new ChatGPTAPI**(`opts`)
Creates a new client wrapper around the unofficial ChatGPT REST API.
Note that your IP address and `userAgent` must match the same values that you used
to obtain your `clearanceToken`.
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `opts` | `Object` | - |
| `opts.accessToken?` | `string` | **`Default Value`** `undefined` * |
| `opts.accessTokenTTL?` | `number` | **`Default Value`** 1 hour * |
| `opts.apiBaseUrl?` | `string` | **`Default Value`** `'https://chat.openai.com/api'` * |
| `opts.backendApiBaseUrl?` | `string` | **`Default Value`** `'https://chat.openai.com/backend-api'` * |
| `opts.clearanceToken` | `string` | = **Required** Cloudflare `cf_clearance` cookie value (see readme for instructions) |
| `opts.debug?` | `boolean` | **`Default Value`** `false` * |
| `opts.headers?` | `Record`<`string`, `string`\> | **`Default Value`** `undefined` * |
| `opts.markdown?` | `boolean` | **`Default Value`** `true` * |
| `opts.sessionToken` | `string` | = **Required** OpenAI session token which can be found in a valid session's cookies (see readme for instructions) |
| `opts.userAgent?` | `string` | **`Default Value`** `Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36'` * |
#### Overrides
[AChatGPTAPI](AChatGPTAPI.md).[constructor](AChatGPTAPI.md#constructor)
#### Defined in
[src/chatgpt-api.ts:45](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api.ts#L45)
## Accessors
### clearanceToken
`get` **clearanceToken**(): `string`
Gets the current Cloudflare clearance token (`cf_clearance` cookie value).
#### Returns
`string`
#### Defined in
[src/chatgpt-api.ts:143](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api.ts#L143)
___
### sessionToken
`get` **sessionToken**(): `string`
Gets the current session token.
#### Returns
`string`
#### Defined in
[src/chatgpt-api.ts:138](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api.ts#L138)
___
### user
`get` **user**(): [`User`](../modules.md#user)
Gets the currently signed-in user, if authenticated, `null` otherwise.
#### Returns
[`User`](../modules.md#user)
#### Defined in
[src/chatgpt-api.ts:133](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api.ts#L133)
___
### userAgent
`get` **userAgent**(): `string`
Gets the current user agent.
#### Returns
`string`
#### Defined in
[src/chatgpt-api.ts:148](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api.ts#L148)
## Methods
### closeSession
**closeSession**(): `Promise`<`void`\>
Closes the active session.
**`Throws`**
An error if it fails.
#### Returns
`Promise`<`void`\>
#### Overrides
[AChatGPTAPI](AChatGPTAPI.md).[closeSession](AChatGPTAPI.md#closesession)
#### Defined in
[src/chatgpt-api.ts:470](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api.ts#L470)
___
### getIsAuthenticated
**getIsAuthenticated**(): `Promise`<`boolean`\>
#### Returns
`Promise`<`boolean`\>
`true` if the client has a valid acces token or `false` if refreshing
the token fails.
#### Overrides
[AChatGPTAPI](AChatGPTAPI.md).[getIsAuthenticated](AChatGPTAPI.md#getisauthenticated)
#### Defined in
[src/chatgpt-api.ts:367](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api.ts#L367)
___
### initSession
**initSession**(): `Promise`<`void`\>
Refreshes the client's access token which will succeed only if the session
is valid.
#### Returns
`Promise`<`void`\>
#### Overrides
[AChatGPTAPI](AChatGPTAPI.md).[initSession](AChatGPTAPI.md#initsession)
#### Defined in
[src/chatgpt-api.ts:156](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api.ts#L156)
___
### refreshSession
**refreshSession**(): `Promise`<`string`\>
Attempts to refresh the current access token using the ChatGPT
`sessionToken` cookie.
Access tokens will be cached for up to `accessTokenTTL` milliseconds to
prevent refreshing access tokens too frequently.
**`Throws`**
An error if refreshing the access token fails.
#### Returns
`Promise`<`string`\>
A valid access token
#### Overrides
[AChatGPTAPI](AChatGPTAPI.md).[refreshSession](AChatGPTAPI.md#refreshsession)
#### Defined in
[src/chatgpt-api.ts:386](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api.ts#L386)
___
### resetSession
**resetSession**(): `Promise`<`any`\>
Closes the current ChatGPT session and starts a new one.
Useful for bypassing 401 errors when sessions expire.
**`Throws`**
An error if it fails.
#### Returns
`Promise`<`any`\>
Access credentials for the new session.
#### Inherited from
[AChatGPTAPI](AChatGPTAPI.md).[resetSession](AChatGPTAPI.md#resetsession)
#### Defined in
[src/abstract-chatgpt-api.ts:59](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/abstract-chatgpt-api.ts#L59)
___
### sendMessage
**sendMessage**(`message`, `opts?`): `Promise`<[`ChatResponse`](../modules.md#chatresponse)\>
Sends a message to ChatGPT, waits for the response to resolve, and returns
the response.
If you want to receive a stream of partial responses, use `opts.onProgress`.
If you want to receive the full response, including message and conversation IDs,
you can use `opts.onConversationResponse` or use the `ChatGPTAPI.getConversation`
helper.
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `message` | `string` | The prompt message to send |
| `opts` | [`SendMessageOptions`](../modules.md#sendmessageoptions) | - |
#### Returns
`Promise`<[`ChatResponse`](../modules.md#chatresponse)\>
The response from ChatGPT
#### Overrides
[AChatGPTAPI](AChatGPTAPI.md).[sendMessage](AChatGPTAPI.md#sendmessage)
#### Defined in
[src/chatgpt-api.ts:180](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api.ts#L180)
___
### sendModeration
**sendModeration**(`input`): `Promise`<[`ModerationsJSONResult`](../modules.md#moderationsjsonresult)\>
Creates a new client wrapper around OpenAI's chat completion API, mimicing the official ChatGPT webapp's functionality as closely as possible.
#### Parameters
| Name | Type |
| :------ | :------ |
| `input` | `string` |
#### Returns
`Promise`<[`ModerationsJSONResult`](../modules.md#moderationsjsonresult)\>
| `opts` | [`ChatGPTAPIOptions`](../modules.md#chatgptapioptions) |
#### Defined in
[src/chatgpt-api.ts:324](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api.ts#L324)
[src/chatgpt-api.ts:49](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/chatgpt-api.ts#L49)
## Accessors
### apiKey
`get` **apiKey**(): `string`
#### Returns
`string`
#### Defined in
[src/chatgpt-api.ts:311](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/chatgpt-api.ts#L311)
`set` **apiKey**(`apiKey`): `void`
#### Parameters
| Name | Type |
| :------ | :------ |
| `apiKey` | `string` |
#### Returns
`void`
#### Defined in
[src/chatgpt-api.ts:315](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/chatgpt-api.ts#L315)
## Methods
### sendMessage
**sendMessage**(`text`, `opts?`): `Promise`<[`ChatMessage`](../interfaces/ChatMessage.md)\>
Sends a message to the OpenAI chat completions endpoint, waits for the response
to resolve, and returns the response.
If you want your response to have historical context, you must provide a valid `parentMessageId`.
If you want to receive a stream of partial responses, use `opts.onProgress`.
Set `debug: true` in the `ChatGPTAPI` constructor to log more info on the full prompt sent to the OpenAI chat completions API. You can override the `systemMessage` in `opts` to customize the assistant's instructions.
#### Parameters
| Name | Type |
| :------ | :------ |
| `text` | `string` |
| `opts` | [`SendMessageOptions`](../modules.md#sendmessageoptions) |
#### Returns
`Promise`<[`ChatMessage`](../interfaces/ChatMessage.md)\>
The response from ChatGPT
#### Defined in
[src/chatgpt-api.ts:132](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/chatgpt-api.ts#L132)
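The `sendMessage` description above maps onto a short usage sketch. This is a minimal example, not part of the diff, assuming the `chatgpt` package export and an `OPENAI_API_KEY` environment variable as used in the demos earlier in this comparison; the prompts and system message are placeholders:

```ts
import { ChatGPTAPI } from 'chatgpt'

async function main() {
  const api = new ChatGPTAPI({
    apiKey: process.env.OPENAI_API_KEY,
    debug: true // logs the full prompt sent to the OpenAI chat completions API
  })

  // First message starts a fresh conversation.
  const first = await api.sendMessage('Write a haiku about TypeScript.')

  // Passing the previous message's id as parentMessageId gives the follow-up historical context.
  const followUp = await api.sendMessage('Make it rhyme.', {
    parentMessageId: first.id,
    systemMessage: 'You are a terse poetry assistant.', // overrides the default assistant instructions
    onProgress: (partial) => process.stdout.write(partial.delta ?? '') // stream partial responses
  })

  console.log('\n' + followUp.text)
}

main().catch((err) => {
  console.error(err)
  process.exit(1)
})
```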

View file

@@ -1,274 +0,0 @@
[chatgpt](../readme.md) / [Exports](../modules.md) / ChatGPTAPIBrowser
# Class: ChatGPTAPIBrowser
## Hierarchy
- [`AChatGPTAPI`](AChatGPTAPI.md)
**`ChatGPTAPIBrowser`**
## Table of contents
### Constructors
- [constructor](ChatGPTAPIBrowser.md#constructor)
### Accessors
- [isChatPage](ChatGPTAPIBrowser.md#ischatpage)
### Methods
- [\_onRequest](ChatGPTAPIBrowser.md#_onrequest)
- [\_onResponse](ChatGPTAPIBrowser.md#_onresponse)
- [closeSession](ChatGPTAPIBrowser.md#closesession)
- [getIsAuthenticated](ChatGPTAPIBrowser.md#getisauthenticated)
- [initSession](ChatGPTAPIBrowser.md#initsession)
- [refreshSession](ChatGPTAPIBrowser.md#refreshsession)
- [resetSession](ChatGPTAPIBrowser.md#resetsession)
- [resetThread](ChatGPTAPIBrowser.md#resetthread)
- [sendMessage](ChatGPTAPIBrowser.md#sendmessage)
## Constructors
### constructor
**new ChatGPTAPIBrowser**(`opts`)
Creates a new client for automating the ChatGPT webapp.
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `opts` | `Object` | - |
| `opts.captchaToken?` | `string` | **`Default Value`** `undefined` * |
| `opts.debug?` | `boolean` | **`Default Value`** `false` * |
| `opts.email` | `string` | - |
| `opts.executablePath?` | `string` | **`Default Value`** `undefined` * |
| `opts.isGoogleLogin?` | `boolean` | **`Default Value`** `false` * |
| `opts.isMicrosoftLogin?` | `boolean` | **`Default Value`** `false` * |
| `opts.markdown?` | `boolean` | **`Default Value`** `true` * |
| `opts.minimize?` | `boolean` | **`Default Value`** `true` * |
| `opts.password` | `string` | - |
| `opts.proxyServer?` | `string` | **`Default Value`** `undefined` * |
#### Overrides
[AChatGPTAPI](AChatGPTAPI.md).[constructor](AChatGPTAPI.md#constructor)
#### Defined in
[src/chatgpt-api-browser.ts:38](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api-browser.ts#L38)
## Accessors
### isChatPage
`get` **isChatPage**(): `boolean`
#### Returns
`boolean`
#### Defined in
[src/chatgpt-api-browser.ts:599](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api-browser.ts#L599)
## Methods
### \_onRequest
**_onRequest**(`request`): `void`
#### Parameters
| Name | Type |
| :------ | :------ |
| `request` | `HTTPRequest` |
#### Returns
`void`
#### Defined in
[src/chatgpt-api-browser.ts:204](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api-browser.ts#L204)
___
### \_onResponse
**_onResponse**(`response`): `Promise`<`void`\>
#### Parameters
| Name | Type |
| :------ | :------ |
| `response` | `HTTPResponse` |
#### Returns
`Promise`<`void`\>
#### Defined in
[src/chatgpt-api-browser.ts:241](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api-browser.ts#L241)
___
### closeSession
**closeSession**(): `Promise`<`void`\>
Closes the active session.
**`Throws`**
An error if it fails.
#### Returns
`Promise`<`void`\>
#### Overrides
[AChatGPTAPI](AChatGPTAPI.md).[closeSession](AChatGPTAPI.md#closesession)
#### Defined in
[src/chatgpt-api-browser.ts:574](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api-browser.ts#L574)
___
### getIsAuthenticated
**getIsAuthenticated**(): `Promise`<`boolean`\>
#### Returns
`Promise`<`boolean`\>
`true` if the client is authenticated with a valid session or `false`
otherwise.
#### Overrides
[AChatGPTAPI](AChatGPTAPI.md).[getIsAuthenticated](AChatGPTAPI.md#getisauthenticated)
#### Defined in
[src/chatgpt-api-browser.ts:335](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api-browser.ts#L335)
___
### initSession
**initSession**(): `Promise`<`void`\>
Performs any async initialization work required to ensure that this API is
properly authenticated.
**`Throws`**
An error if the session failed to initialize properly.
#### Returns
`Promise`<`void`\>
#### Overrides
[AChatGPTAPI](AChatGPTAPI.md).[initSession](AChatGPTAPI.md#initsession)
#### Defined in
[src/chatgpt-api-browser.ts:106](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api-browser.ts#L106)
___
### refreshSession
**refreshSession**(): `Promise`<`void`\>
Attempts to handle 403 errors by refreshing the page.
#### Returns
`Promise`<`void`\>
#### Overrides
[AChatGPTAPI](AChatGPTAPI.md).[refreshSession](AChatGPTAPI.md#refreshsession)
#### Defined in
[src/chatgpt-api-browser.ts:313](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api-browser.ts#L313)
___
### resetSession
**resetSession**(): `Promise`<`void`\>
Attempts to handle 401 errors by re-authenticating.
#### Returns
`Promise`<`void`\>
#### Overrides
[AChatGPTAPI](AChatGPTAPI.md).[resetSession](AChatGPTAPI.md#resetsession)
#### Defined in
[src/chatgpt-api-browser.ts:294](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api-browser.ts#L294)
___
### resetThread
**resetThread**(): `Promise`<`void`\>
#### Returns
`Promise`<`void`\>
#### Defined in
[src/chatgpt-api-browser.ts:566](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api-browser.ts#L566)
___
### sendMessage
**sendMessage**(`message`, `opts?`): `Promise`<[`ChatResponse`](../modules.md#chatresponse)\>
Sends a message to ChatGPT, waits for the response to resolve, and returns
the response.
If you want to receive a stream of partial responses, use `opts.onProgress`.
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `message` | `string` | The prompt message to send |
| `opts` | [`SendMessageOptions`](../modules.md#sendmessageoptions) | - |
#### Returns
`Promise`<[`ChatResponse`](../modules.md#chatresponse)\>
The response from ChatGPT, including `conversationId`, `messageId`, and
the `response` text.
#### Overrides
[AChatGPTAPI](AChatGPTAPI.md).[sendMessage](AChatGPTAPI.md#sendmessage)
#### Defined in
[src/chatgpt-api-browser.ts:412](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/chatgpt-api-browser.ts#L412)
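For historical context only, a sketch of how this browser-based client was used before its removal, pieced together from the constructor options and method signatures documented above; it is not part of the current package:

```ts
import { ChatGPTAPIBrowser } from 'chatgpt' // deprecated; removed from the package in v5

const api = new ChatGPTAPIBrowser({
  email: process.env.OPENAI_EMAIL,
  password: process.env.OPENAI_PASSWORD,
  markdown: true, // default: return responses as markdown
  minimize: true  // default: minimize the automated browser window
})

await api.initSession()

const result = await api.sendMessage('Hello World!')
// result is a ChatResponse containing conversationId, messageId, and the response text
console.log(result.response)

await api.closeSession()
```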


@ -16,8 +16,8 @@
### Properties
- [originalError](ChatGPTError.md#originalerror)
- [response](ChatGPTError.md#response)
- [accountId](ChatGPTError.md#accountid)
- [isFinal](ChatGPTError.md#isfinal)
- [statusCode](ChatGPTError.md#statuscode)
- [statusText](ChatGPTError.md#statustext)
@ -39,7 +39,7 @@ Error.constructor
#### Defined in
node_modules/.pnpm/typescript@4.9.3/node_modules/typescript/lib/lib.es5.d.ts:1059
node_modules/.pnpm/typescript@4.9.5/node_modules/typescript/lib/lib.es5.d.ts:1059
**new ChatGPTError**(`message?`, `options?`)
@ -56,27 +56,27 @@ Error.constructor
#### Defined in
node_modules/.pnpm/typescript@4.9.3/node_modules/typescript/lib/lib.es2022.error.d.ts:30
node_modules/.pnpm/typescript@4.9.5/node_modules/typescript/lib/lib.es2022.error.d.ts:30
## Properties
### originalError
### accountId
`Optional` **originalError**: `Error`
`Optional` **accountId**: `string`
#### Defined in
[src/types.ts:297](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L297)
[src/types.ts:80](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L80)
___
### response
### isFinal
`Optional` **response**: `Response`
`Optional` **isFinal**: `boolean`
#### Defined in
[src/types.ts:296](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L296)
[src/types.ts:79](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L79)
___
@ -86,7 +86,7 @@ ___
#### Defined in
[src/types.ts:294](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L294)
[src/types.ts:77](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L77)
___
@ -96,4 +96,4 @@ ___
#### Defined in
[src/types.ts:295](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L295)
[src/types.ts:78](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L78)


@ -0,0 +1,104 @@
[chatgpt](../readme.md) / [Exports](../modules.md) / ChatGPTUnofficialProxyAPI
# Class: ChatGPTUnofficialProxyAPI
## Table of contents
### Constructors
- [constructor](ChatGPTUnofficialProxyAPI.md#constructor)
### Accessors
- [accessToken](ChatGPTUnofficialProxyAPI.md#accesstoken)
### Methods
- [sendMessage](ChatGPTUnofficialProxyAPI.md#sendmessage)
## Constructors
### constructor
**new ChatGPTUnofficialProxyAPI**(`opts`)
#### Parameters
| Name | Type | Description |
| :------ | :------ | :------ |
| `opts` | `Object` | - |
| `opts.accessToken` | `string` | - |
| `opts.apiReverseProxyUrl?` | `string` | **`Default Value`** `https://bypass.duti.tech/api/conversation` * |
| `opts.debug?` | `boolean` | **`Default Value`** `false` * |
| `opts.fetch?` | (`input`: `RequestInfo` \| `URL`, `init?`: `RequestInit`) => `Promise`<`Response`\> | - |
| `opts.headers?` | `Record`<`string`, `string`\> | **`Default Value`** `undefined` * |
| `opts.model?` | `string` | **`Default Value`** `text-davinci-002-render-sha` * |
#### Defined in
[src/chatgpt-unofficial-proxy-api.ts:20](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/chatgpt-unofficial-proxy-api.ts#L20)
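A hedged construction sketch based on the option defaults in the table above; the access-token environment variable name is illustrative, and the reverse-proxy URL shown is simply the documented default:

```ts
import { ChatGPTUnofficialProxyAPI } from 'chatgpt'

// the access token must come from a logged-in ChatGPT session (see the readme's Access Token section)
const api = new ChatGPTUnofficialProxyAPI({
  accessToken: process.env.OPENAI_ACCESS_TOKEN, // illustrative env var name
  // the remaining options restate the documented defaults and can be omitted
  apiReverseProxyUrl: 'https://bypass.duti.tech/api/conversation',
  model: 'text-davinci-002-render-sha',
  debug: false
})
```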
## Accessors
### accessToken
`get` **accessToken**(): `string`
#### Returns
`string`
#### Defined in
[src/chatgpt-unofficial-proxy-api.ts:66](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/chatgpt-unofficial-proxy-api.ts#L66)
`set` **accessToken**(`value`): `void`
#### Parameters
| Name | Type |
| :------ | :------ |
| `value` | `string` |
#### Returns
`void`
#### Defined in
[src/chatgpt-unofficial-proxy-api.ts:70](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/chatgpt-unofficial-proxy-api.ts#L70)
## Methods
### sendMessage
**sendMessage**(`text`, `opts?`): `Promise`<[`ChatMessage`](../interfaces/ChatMessage.md)\>
Sends a message to ChatGPT, waits for the response to resolve, and returns
the response.
If you want your response to have historical context, you must provide a valid `parentMessageId`.
If you want to receive a stream of partial responses, use `opts.onProgress`.
If you want to receive the full response, including message and conversation IDs,
you can use `opts.onConversationResponse` or use the `ChatGPTAPI.getConversation`
helper.
Set `debug: true` in the `ChatGPTAPI` constructor to log more info on the full prompt sent to the OpenAI completions API. You can override the `promptPrefix` and `promptSuffix` in `opts` to customize the prompt.
#### Parameters
| Name | Type |
| :------ | :------ |
| `text` | `string` |
| `opts` | [`SendMessageBrowserOptions`](../modules.md#sendmessagebrowseroptions) |
#### Returns
`Promise`<[`ChatMessage`](../interfaces/ChatMessage.md)\>
The response from ChatGPT
#### Defined in
[src/chatgpt-unofficial-proxy-api.ts:97](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/chatgpt-unofficial-proxy-api.ts#L97)
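A short sketch of conversation tracking with this client, using only the `SendMessageBrowserOptions` fields (`conversationId`, `parentMessageId`, `onProgress`) and `ChatMessage` fields documented elsewhere on these pages; `api` is assumed to be the instance constructed in the example above:

```ts
const first = await api.sendMessage('What is the tallest mountain on Earth?', {
  onProgress: (partial) => process.stdout.write(partial.delta ?? '') // stream partial responses
})

// pass both ids back so the follow-up stays in the same ChatGPT conversation
const second = await api.sendMessage('How tall is it in feet?', {
  conversationId: first.conversationId,
  parentMessageId: first.id
})
console.log(second.text)
```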


@ -0,0 +1,96 @@
[chatgpt](../readme.md) / [Exports](../modules.md) / ChatMessage
# Interface: ChatMessage
## Table of contents
### Properties
- [conversationId](ChatMessage.md#conversationid)
- [delta](ChatMessage.md#delta)
- [detail](ChatMessage.md#detail)
- [id](ChatMessage.md#id)
- [name](ChatMessage.md#name)
- [parentMessageId](ChatMessage.md#parentmessageid)
- [role](ChatMessage.md#role)
- [text](ChatMessage.md#text)
## Properties
### conversationId
`Optional` **conversationId**: `string`
#### Defined in
[src/types.ts:73](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L73)
___
### delta
`Optional` **delta**: `string`
#### Defined in
[src/types.ts:67](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L67)
___
### detail
`Optional` **detail**: `any`
#### Defined in
[src/types.ts:68](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L68)
___
### id
**id**: `string`
#### Defined in
[src/types.ts:63](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L63)
___
### name
`Optional` **name**: `string`
#### Defined in
[src/types.ts:66](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L66)
___
### parentMessageId
`Optional` **parentMessageId**: `string`
#### Defined in
[src/types.ts:71](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L71)
___
### role
**role**: [`Role`](../modules.md#role)
#### Defined in
[src/types.ts:65](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L65)
___
### text
**text**: `string`
#### Defined in
[src/types.ts:64](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L64)
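Taken together, the properties above describe the object returned by `sendMessage` (and passed to `onProgress`); a sketch of the interface as reconstructed from this page:

```ts
type Role = 'user' | 'assistant' | 'system'

interface ChatMessage {
  id: string
  text: string
  role: Role
  name?: string
  delta?: string   // incremental text chunk, present while streaming
  detail?: any     // extra response detail (typed as `any` here)
  parentMessageId?: string
  conversationId?: string
}
```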


@ -0,0 +1,67 @@
[chatgpt](../readme.md) / [Exports](../modules.md) / [openai](../modules/openai.md) / ChatCompletionRequestMessage
# Interface: ChatCompletionRequestMessage
[openai](../modules/openai.md).ChatCompletionRequestMessage
**`Export`**
**`Interface`**
ChatCompletionRequestMessage
## Table of contents
### Properties
- [content](openai.ChatCompletionRequestMessage.md#content)
- [name](openai.ChatCompletionRequestMessage.md#name)
- [role](openai.ChatCompletionRequestMessage.md#role)
## Properties
### content
**content**: `string`
The contents of the message
**`Memberof`**
ChatCompletionRequestMessage
#### Defined in
[src/types.ts:211](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L211)
___
### name
`Optional` **name**: `string`
The name of the user in a multi-user chat
**`Memberof`**
ChatCompletionRequestMessage
#### Defined in
[src/types.ts:217](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L217)
___
### role
**role**: [`ChatCompletionRequestMessageRoleEnum`](../modules/openai.md#chatcompletionrequestmessageroleenum-1)
The role of the author of this message.
**`Memberof`**
ChatCompletionRequestMessage
#### Defined in
[src/types.ts:205](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L205)
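For illustration, a request-message array matching the fields above; this assumes the `openai` namespace is importable from the package, as the module listing later in this diff suggests, and the message contents are placeholders:

```ts
import type { openai } from 'chatgpt'

const messages: openai.ChatCompletionRequestMessage[] = [
  { role: 'system', content: 'You are a helpful assistant.' },
  // `name` identifies a particular user in a multi-user chat
  { role: 'user', content: 'Summarize the plot of Hamlet in one sentence.', name: 'alice' }
]
```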


@ -0,0 +1,50 @@
[chatgpt](../readme.md) / [Exports](../modules.md) / [openai](../modules/openai.md) / ChatCompletionResponseMessage
# Interface: ChatCompletionResponseMessage
[openai](../modules/openai.md).ChatCompletionResponseMessage
**`Export`**
**`Interface`**
ChatCompletionResponseMessage
## Table of contents
### Properties
- [content](openai.ChatCompletionResponseMessage.md#content)
- [role](openai.ChatCompletionResponseMessage.md#role)
## Properties
### content
**content**: `string`
The contents of the message
**`Memberof`**
ChatCompletionResponseMessage
#### Defined in
[src/types.ts:243](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L243)
___
### role
**role**: [`ChatCompletionResponseMessageRoleEnum`](../modules/openai.md#chatcompletionresponsemessageroleenum-1)
The role of the author of this message.
**`Memberof`**
ChatCompletionResponseMessage
#### Defined in
[src/types.ts:237](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L237)


@ -0,0 +1,65 @@
[chatgpt](../readme.md) / [Exports](../modules.md) / [openai](../modules/openai.md) / CreateChatCompletionDeltaResponse
# Interface: CreateChatCompletionDeltaResponse
[openai](../modules/openai.md).CreateChatCompletionDeltaResponse
## Table of contents
### Properties
- [choices](openai.CreateChatCompletionDeltaResponse.md#choices)
- [created](openai.CreateChatCompletionDeltaResponse.md#created)
- [id](openai.CreateChatCompletionDeltaResponse.md#id)
- [model](openai.CreateChatCompletionDeltaResponse.md#model)
- [object](openai.CreateChatCompletionDeltaResponse.md#object)
## Properties
### choices
**choices**: [{ `delta`: { `content?`: `string` ; `role`: [`Role`](../modules.md#role) } ; `finish_reason`: `string` ; `index`: `number` }]
#### Defined in
[src/types.ts:182](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L182)
___
### created
**created**: `number`
#### Defined in
[src/types.ts:180](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L180)
___
### id
**id**: `string`
#### Defined in
[src/types.ts:178](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L178)
___
### model
**model**: `string`
#### Defined in
[src/types.ts:181](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L181)
___
### object
**object**: ``"chat.completion.chunk"``
#### Defined in
[src/types.ts:179](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L179)
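A sketch of consuming one streamed chunk shaped like this interface (the SSE parsing itself is out of scope here); the field accesses follow the `choices` tuple documented above:

```ts
import type { openai } from 'chatgpt'

// `chunk` is a single parsed `data:` event from a streaming chat completion
function handleChunk(
  chunk: openai.CreateChatCompletionDeltaResponse,
  buffer: { text: string }
): void {
  const [choice] = chunk.choices

  if (choice.delta.content) {
    buffer.text += choice.delta.content // accumulate partial tokens into the full reply
  }

  if (choice.finish_reason) {
    console.log(`stream finished (${choice.finish_reason}): ${buffer.text}`)
  }
}
```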


@ -0,0 +1,218 @@
[chatgpt](../readme.md) / [Exports](../modules.md) / [openai](../modules/openai.md) / CreateChatCompletionRequest
# Interface: CreateChatCompletionRequest
[openai](../modules/openai.md).CreateChatCompletionRequest
**`Export`**
**`Interface`**
CreateChatCompletionRequest
## Table of contents
### Properties
- [frequency\_penalty](openai.CreateChatCompletionRequest.md#frequency_penalty)
- [logit\_bias](openai.CreateChatCompletionRequest.md#logit_bias)
- [max\_tokens](openai.CreateChatCompletionRequest.md#max_tokens)
- [messages](openai.CreateChatCompletionRequest.md#messages)
- [model](openai.CreateChatCompletionRequest.md#model)
- [n](openai.CreateChatCompletionRequest.md#n)
- [presence\_penalty](openai.CreateChatCompletionRequest.md#presence_penalty)
- [stop](openai.CreateChatCompletionRequest.md#stop)
- [stream](openai.CreateChatCompletionRequest.md#stream)
- [temperature](openai.CreateChatCompletionRequest.md#temperature)
- [top\_p](openai.CreateChatCompletionRequest.md#top_p)
- [user](openai.CreateChatCompletionRequest.md#user)
## Properties
### frequency\_penalty
`Optional` **frequency\_penalty**: `number`
Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim. [See more information about frequency and presence penalties.](/docs/api-reference/parameter-details)
**`Memberof`**
CreateChatCompletionRequest
#### Defined in
[src/types.ts:317](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L317)
___
### logit\_bias
`Optional` **logit\_bias**: `object`
Modify the likelihood of specified tokens appearing in the completion. Accepts a json object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token.
**`Memberof`**
CreateChatCompletionRequest
#### Defined in
[src/types.ts:323](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L323)
___
### max\_tokens
`Optional` **max\_tokens**: `number`
The maximum number of tokens allowed for the generated answer. By default, the number of tokens the model can return will be (4096 - prompt tokens).
**`Memberof`**
CreateChatCompletionRequest
#### Defined in
[src/types.ts:305](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L305)
___
### messages
**messages**: [`ChatCompletionRequestMessage`](openai.ChatCompletionRequestMessage.md)[]
The messages to generate chat completions for, in the [chat format](/docs/guides/chat/introduction).
**`Memberof`**
CreateChatCompletionRequest
#### Defined in
[src/types.ts:269](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L269)
___
### model
**model**: `string`
ID of the model to use. Currently, only `gpt-3.5-turbo` and `gpt-3.5-turbo-0301` are supported.
**`Memberof`**
CreateChatCompletionRequest
#### Defined in
[src/types.ts:263](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L263)
___
### n
`Optional` **n**: `number`
How many chat completion choices to generate for each input message.
**`Memberof`**
CreateChatCompletionRequest
#### Defined in
[src/types.ts:287](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L287)
___
### presence\_penalty
`Optional` **presence\_penalty**: `number`
Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics. [See more information about frequency and presence penalties.](/docs/api-reference/parameter-details)
**`Memberof`**
CreateChatCompletionRequest
#### Defined in
[src/types.ts:311](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L311)
___
### stop
`Optional` **stop**: [`CreateChatCompletionRequestStop`](../modules/openai.md#createchatcompletionrequeststop)
**`Memberof`**
CreateChatCompletionRequest
#### Defined in
[src/types.ts:299](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L299)
___
### stream
`Optional` **stream**: `boolean`
If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as data-only [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) as they become available, with the stream terminated by a `data: [DONE]` message.
**`Memberof`**
CreateChatCompletionRequest
#### Defined in
[src/types.ts:293](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L293)
___
### temperature
`Optional` **temperature**: `number`
What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. We generally recommend altering this or `top_p` but not both.
**`Memberof`**
CreateChatCompletionRequest
#### Defined in
[src/types.ts:275](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L275)
___
### top\_p
`Optional` **top\_p**: `number`
An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. We generally recommend altering this or `temperature` but not both.
**`Memberof`**
CreateChatCompletionRequest
#### Defined in
[src/types.ts:281](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L281)
___
### user
`Optional` **user**: `string`
A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids).
**`Memberof`**
CreateChatCompletionRequest
#### Defined in
[src/types.ts:329](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L329)
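Putting the fields together, a sample request body: only `model` and `messages` are required, and the optional parameters below simply restate the documented options with illustrative values:

```ts
import type { openai } from 'chatgpt'

const request: openai.CreateChatCompletionRequest = {
  model: 'gpt-3.5-turbo', // or 'gpt-3.5-turbo-0301'
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Explain TCP slow start in two sentences.' }
  ],
  temperature: 0.7,    // alter this or top_p, not both
  max_tokens: 256,     // cap on the generated answer
  presence_penalty: 0,
  frequency_penalty: 0,
  stream: false,
  user: 'user-1234'    // end-user id for abuse monitoring
}
```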


@ -0,0 +1,106 @@
[chatgpt](../readme.md) / [Exports](../modules.md) / [openai](../modules/openai.md) / CreateChatCompletionResponse
# Interface: CreateChatCompletionResponse
[openai](../modules/openai.md).CreateChatCompletionResponse
**`Export`**
**`Interface`**
CreateChatCompletionResponse
## Table of contents
### Properties
- [choices](openai.CreateChatCompletionResponse.md#choices)
- [created](openai.CreateChatCompletionResponse.md#created)
- [id](openai.CreateChatCompletionResponse.md#id)
- [model](openai.CreateChatCompletionResponse.md#model)
- [object](openai.CreateChatCompletionResponse.md#object)
- [usage](openai.CreateChatCompletionResponse.md#usage)
## Properties
### choices
**choices**: [`CreateChatCompletionResponseChoicesInner`](openai.CreateChatCompletionResponseChoicesInner.md)[]
**`Memberof`**
CreateChatCompletionResponse
#### Defined in
[src/types.ts:372](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L372)
___
### created
**created**: `number`
**`Memberof`**
CreateChatCompletionResponse
#### Defined in
[src/types.ts:360](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L360)
___
### id
**id**: `string`
**`Memberof`**
CreateChatCompletionResponse
#### Defined in
[src/types.ts:348](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L348)
___
### model
**model**: `string`
**`Memberof`**
CreateChatCompletionResponse
#### Defined in
[src/types.ts:366](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L366)
___
### object
**object**: `string`
**`Memberof`**
CreateChatCompletionResponse
#### Defined in
[src/types.ts:354](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L354)
___
### usage
`Optional` **usage**: [`CreateCompletionResponseUsage`](openai.CreateCompletionResponseUsage.md)
**`Memberof`**
CreateChatCompletionResponse
#### Defined in
[src/types.ts:378](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L378)
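A small sketch of reading a non-streaming response shaped like this interface, using the `choices` and optional `usage` fields described here and in the related interfaces below:

```ts
import type { openai } from 'chatgpt'

function logCompletion(res: openai.CreateChatCompletionResponse): void {
  // `message` is optional on each choice, so guard the access
  const reply = res.choices[0]?.message?.content ?? ''
  console.log(`[${res.model}] ${reply}`)

  if (res.usage) {
    const { prompt_tokens, completion_tokens, total_tokens } = res.usage
    console.log(
      `tokens: ${prompt_tokens} prompt + ${completion_tokens} completion = ${total_tokens} total`
    )
  }
}
```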


@ -0,0 +1,61 @@
[chatgpt](../readme.md) / [Exports](../modules.md) / [openai](../modules/openai.md) / CreateChatCompletionResponseChoicesInner
# Interface: CreateChatCompletionResponseChoicesInner
[openai](../modules/openai.md).CreateChatCompletionResponseChoicesInner
**`Export`**
**`Interface`**
CreateChatCompletionResponseChoicesInner
## Table of contents
### Properties
- [finish\_reason](openai.CreateChatCompletionResponseChoicesInner.md#finish_reason)
- [index](openai.CreateChatCompletionResponseChoicesInner.md#index)
- [message](openai.CreateChatCompletionResponseChoicesInner.md#message)
## Properties
### finish\_reason
`Optional` **finish\_reason**: `string`
**`Memberof`**
CreateChatCompletionResponseChoicesInner
#### Defined in
[src/types.ts:403](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L403)
___
### index
`Optional` **index**: `number`
**`Memberof`**
CreateChatCompletionResponseChoicesInner
#### Defined in
[src/types.ts:391](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L391)
___
### message
`Optional` **message**: [`ChatCompletionResponseMessage`](openai.ChatCompletionResponseMessage.md)
**`Memberof`**
CreateChatCompletionResponseChoicesInner
#### Defined in
[src/types.ts:397](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L397)


@ -0,0 +1,61 @@
[chatgpt](../readme.md) / [Exports](../modules.md) / [openai](../modules/openai.md) / CreateCompletionResponseUsage
# Interface: CreateCompletionResponseUsage
[openai](../modules/openai.md).CreateCompletionResponseUsage
**`Export`**
**`Interface`**
CreateCompletionResponseUsage
## Table of contents
### Properties
- [completion\_tokens](openai.CreateCompletionResponseUsage.md#completion_tokens)
- [prompt\_tokens](openai.CreateCompletionResponseUsage.md#prompt_tokens)
- [total\_tokens](openai.CreateCompletionResponseUsage.md#total_tokens)
## Properties
### completion\_tokens
**completion\_tokens**: `number`
**`Memberof`**
CreateCompletionResponseUsage
#### Defined in
[src/types.ts:422](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L422)
___
### prompt\_tokens
**prompt\_tokens**: `number`
**`Memberof`**
CreateCompletionResponseUsage
#### Defined in
[src/types.ts:416](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L416)
___
### total\_tokens
**total\_tokens**: `number`
**`Memberof`**
CreateCompletionResponseUsage
#### Defined in
[src/types.ts:428](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L428)


@ -4,102 +4,64 @@
## Table of contents
### Namespaces
- [openai](modules/openai.md)
### Classes
- [AChatGPTAPI](classes/AChatGPTAPI.md)
- [ChatGPTAPI](classes/ChatGPTAPI.md)
- [ChatGPTAPIBrowser](classes/ChatGPTAPIBrowser.md)
- [ChatGPTError](classes/ChatGPTError.md)
- [ChatGPTUnofficialProxyAPI](classes/ChatGPTUnofficialProxyAPI.md)
### Interfaces
- [ChatMessage](interfaces/ChatMessage.md)
### Type Aliases
- [AvailableModerationModels](modules.md#availablemoderationmodels)
- [ChatError](modules.md#chaterror)
- [ChatResponse](modules.md#chatresponse)
- [ChatGPTAPIOptions](modules.md#chatgptapioptions)
- [ContentType](modules.md#contenttype)
- [ConversationJSONBody](modules.md#conversationjsonbody)
- [ConversationResponseEvent](modules.md#conversationresponseevent)
- [FetchFn](modules.md#fetchfn)
- [GetMessageByIdFunction](modules.md#getmessagebyidfunction)
- [Message](modules.md#message)
- [MessageActionType](modules.md#messageactiontype)
- [MessageContent](modules.md#messagecontent)
- [MessageFeedbackJSONBody](modules.md#messagefeedbackjsonbody)
- [MessageFeedbackRating](modules.md#messagefeedbackrating)
- [MessageFeedbackResult](modules.md#messagefeedbackresult)
- [MessageFeedbackTags](modules.md#messagefeedbacktags)
- [MessageMetadata](modules.md#messagemetadata)
- [Model](modules.md#model)
- [ModelsResult](modules.md#modelsresult)
- [ModerationsJSONBody](modules.md#moderationsjsonbody)
- [ModerationsJSONResult](modules.md#moderationsjsonresult)
- [OpenAIAuth](modules.md#openaiauth)
- [Prompt](modules.md#prompt)
- [PromptContent](modules.md#promptcontent)
- [Role](modules.md#role)
- [SendConversationMessageOptions](modules.md#sendconversationmessageoptions)
- [SendMessageBrowserOptions](modules.md#sendmessagebrowseroptions)
- [SendMessageOptions](modules.md#sendmessageoptions)
- [SessionResult](modules.md#sessionresult)
- [User](modules.md#user)
### Functions
- [browserPostEventStream](modules.md#browserposteventstream)
- [defaultChromeExecutablePath](modules.md#defaultchromeexecutablepath)
- [getBrowser](modules.md#getbrowser)
- [getOpenAIAuth](modules.md#getopenaiauth)
- [initializeNopechaExtension](modules.md#initializenopechaextension)
- [isRelevantRequest](modules.md#isrelevantrequest)
- [markdownToText](modules.md#markdowntotext)
- [maximizePage](modules.md#maximizepage)
- [minimizePage](modules.md#minimizepage)
- [UpsertMessageFunction](modules.md#upsertmessagefunction)
## Type Aliases
### AvailableModerationModels
### ChatGPTAPIOptions
Ƭ **AvailableModerationModels**: ``"text-moderation-playground"``
#### Defined in
[src/types.ts:109](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L109)
___
### ChatError
Ƭ **ChatError**: `Object`
Ƭ **ChatGPTAPIOptions**: `Object`
#### Type declaration
| Name | Type |
| :------ | :------ |
| `conversationId?` | `string` |
| `error` | { `message`: `string` ; `statusCode?`: `number` ; `statusText?`: `string` } |
| `error.message` | `string` |
| `error.statusCode?` | `number` |
| `error.statusText?` | `string` |
| `messageId?` | `string` |
| Name | Type | Description |
| :------ | :------ | :------ |
| `apiBaseUrl?` | `string` | **`Default Value`** `'https://api.openai.com'` * |
| `apiKey` | `string` | - |
| `completionParams?` | `Partial`<`Omit`<[`CreateChatCompletionRequest`](interfaces/openai.CreateChatCompletionRequest.md), ``"messages"`` \| ``"n"`` \| ``"stream"``\>\> | - |
| `debug?` | `boolean` | **`Default Value`** `false` * |
| `fetch?` | [`FetchFn`](modules.md#fetchfn) | - |
| `getMessageById?` | [`GetMessageByIdFunction`](modules.md#getmessagebyidfunction) | - |
| `maxModelTokens?` | `number` | **`Default Value`** `4096` * |
| `maxResponseTokens?` | `number` | **`Default Value`** `1000` * |
| `messageStore?` | `Keyv` | - |
| `systemMessage?` | `string` | - |
| `upsertMessage?` | [`UpsertMessageFunction`](modules.md#upsertmessagefunction) | - |
#### Defined in
[src/types.ts:300](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L300)
___
### ChatResponse
Ƭ **ChatResponse**: `Object`
#### Type declaration
| Name | Type |
| :------ | :------ |
| `conversationId` | `string` |
| `messageId` | `string` |
| `response` | `string` |
#### Defined in
[src/types.ts:306](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L306)
[src/types.ts:7](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L7)
___
@ -109,7 +71,7 @@ ___
#### Defined in
[src/types.ts:1](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L1)
[src/types.ts:136](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L136)
___
@ -131,7 +93,7 @@ https://chat.openapi.com/backend-api/conversation
#### Defined in
[src/types.ts:134](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L134)
[src/types.ts:92](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L92)
___
@ -149,7 +111,43 @@ ___
#### Defined in
[src/types.ts:251](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L251)
[src/types.ts:150](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L150)
___
### FetchFn
Ƭ **FetchFn**: typeof `fetch`
#### Defined in
[src/types.ts:5](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L5)
___
### GetMessageByIdFunction
Ƭ **GetMessageByIdFunction**: (`id`: `string`) => `Promise`<[`ChatMessage`](interfaces/ChatMessage.md)\>
#### Type declaration
▸ (`id`): `Promise`<[`ChatMessage`](interfaces/ChatMessage.md)\>
Returns a chat message from a store by its ID (or null if not found).
##### Parameters
| Name | Type |
| :------ | :------ |
| `id` | `string` |
##### Returns
`Promise`<[`ChatMessage`](interfaces/ChatMessage.md)\>
#### Defined in
[src/types.ts:84](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L84)
___
@ -167,14 +165,14 @@ ___
| `id` | `string` |
| `metadata` | [`MessageMetadata`](modules.md#messagemetadata) |
| `recipient` | `string` |
| `role` | `string` |
| `role` | [`Role`](modules.md#role) |
| `update_time` | `string` \| ``null`` |
| `user` | `string` \| ``null`` |
| `weight` | `number` |
#### Defined in
[src/types.ts:257](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L257)
[src/types.ts:156](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L156)
___
@ -184,7 +182,7 @@ ___
#### Defined in
[src/types.ts:276](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L276)
[src/types.ts:50](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L50)
___
@ -201,69 +199,7 @@ ___
#### Defined in
[src/types.ts:270](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L270)
___
### MessageFeedbackJSONBody
Ƭ **MessageFeedbackJSONBody**: `Object`
https://chat.openapi.com/backend-api/conversation/message_feedback
#### Type declaration
| Name | Type | Description |
| :------ | :------ | :------ |
| `conversation_id` | `string` | The ID of the conversation |
| `message_id` | `string` | The message ID |
| `rating` | [`MessageFeedbackRating`](modules.md#messagefeedbackrating) | The rating |
| `tags?` | [`MessageFeedbackTags`](modules.md#messagefeedbacktags)[] | Tags to give the rating |
| `text?` | `string` | The text to include |
#### Defined in
[src/types.ts:193](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L193)
___
### MessageFeedbackRating
Ƭ **MessageFeedbackRating**: ``"thumbsUp"`` \| ``"thumbsDown"``
#### Defined in
[src/types.ts:249](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L249)
___
### MessageFeedbackResult
Ƭ **MessageFeedbackResult**: `Object`
#### Type declaration
| Name | Type | Description |
| :------ | :------ | :------ |
| `conversation_id` | `string` | The ID of the conversation |
| `message_id` | `string` | The message ID |
| `rating` | [`MessageFeedbackRating`](modules.md#messagefeedbackrating) | The rating |
| `text?` | `string` | The text the server received, including tags |
| `user_id` | `string` | The ID of the user |
#### Defined in
[src/types.ts:222](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L222)
___
### MessageFeedbackTags
Ƭ **MessageFeedbackTags**: ``"harmful"`` \| ``"false"`` \| ``"not-helpful"``
#### Defined in
[src/types.ts:220](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L220)
[src/types.ts:169](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L169)
___
@ -273,104 +209,7 @@ ___
#### Defined in
[src/types.ts:275](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L275)
___
### Model
Ƭ **Model**: `Object`
#### Type declaration
| Name | Type | Description |
| :------ | :------ | :------ |
| `is_special` | `boolean` | Whether or not the model is special |
| `max_tokens` | `number` | Max tokens of the model |
| `slug` | `string` | Name of the model |
#### Defined in
[src/types.ts:77](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L77)
___
### ModelsResult
Ƭ **ModelsResult**: `Object`
https://chat.openapi.com/backend-api/models
#### Type declaration
| Name | Type | Description |
| :------ | :------ | :------ |
| `models` | [`Model`](modules.md#model)[] | Array of models |
#### Defined in
[src/types.ts:70](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L70)
___
### ModerationsJSONBody
Ƭ **ModerationsJSONBody**: `Object`
https://chat.openapi.com/backend-api/moderations
#### Type declaration
| Name | Type | Description |
| :------ | :------ | :------ |
| `input` | `string` | Input for the moderation decision |
| `model` | [`AvailableModerationModels`](modules.md#availablemoderationmodels) | The model to use in the decision |
#### Defined in
[src/types.ts:97](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L97)
___
### ModerationsJSONResult
Ƭ **ModerationsJSONResult**: `Object`
https://chat.openapi.com/backend-api/moderations
#### Type declaration
| Name | Type | Description |
| :------ | :------ | :------ |
| `blocked` | `boolean` | Whether or not the input is blocked |
| `flagged` | `boolean` | Whether or not the input is flagged |
| `moderation_id` | `string` | The ID of the decision |
#### Defined in
[src/types.ts:114](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L114)
___
### OpenAIAuth
Ƭ **OpenAIAuth**: `Object`
Represents everything that's required to pass into `ChatGPTAPI` in order
to authenticate with the unofficial ChatGPT API.
#### Type declaration
| Name | Type |
| :------ | :------ |
| `clearanceToken` | `string` |
| `cookies?` | `Record`<`string`, `Protocol.Network.Cookie`\> |
| `sessionToken` | `string` |
| `userAgent` | `string` |
#### Defined in
[src/openai-auth.ts:28](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/openai-auth.ts#L28)
[src/types.ts:174](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L174)
___
@ -388,7 +227,7 @@ ___
#### Defined in
[src/types.ts:161](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L161)
[src/types.ts:119](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L119)
___
@ -405,33 +244,23 @@ ___
#### Defined in
[src/types.ts:178](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L178)
[src/types.ts:138](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L138)
___
### Role
Ƭ **Role**: ``"user"`` \| ``"assistant"``
Ƭ **Role**: ``"user"`` \| ``"assistant"`` \| ``"system"``
#### Defined in
[src/types.ts:3](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L3)
[src/types.ts:3](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L3)
___
### SendConversationMessageOptions
### SendMessageBrowserOptions
Ƭ **SendConversationMessageOptions**: `Omit`<[`SendMessageOptions`](modules.md#sendmessageoptions), ``"conversationId"`` \| ``"parentMessageId"``\>
#### Defined in
[src/types.ts:288](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L288)
___
### SendMessageOptions
Ƭ **SendMessageOptions**: `Object`
Ƭ **SendMessageBrowserOptions**: `Object`
#### Type declaration
@ -441,269 +270,60 @@ ___
| `action?` | [`MessageActionType`](modules.md#messageactiontype) |
| `conversationId?` | `string` |
| `messageId?` | `string` |
| `onProgress?` | (`partialResponse`: [`ChatResponse`](modules.md#chatresponse)) => `void` |
| `onProgress?` | (`partialResponse`: [`ChatMessage`](interfaces/ChatMessage.md)) => `void` |
| `parentMessageId?` | `string` |
| `timeoutMs?` | `number` |
#### Defined in
[src/types.ts:278](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L278)
[src/types.ts:52](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L52)
___
### SessionResult
### SendMessageOptions
Ƭ **SessionResult**: `Object`
https://chat.openapi.com/api/auth/session
Ƭ **SendMessageOptions**: `Object`
#### Type declaration
| Name | Type | Description |
| :------ | :------ | :------ |
| `accessToken` | `string` | The access token |
| `error?` | `string` \| ``null`` | If there was an error associated with this request |
| `expires` | `string` | ISO date of the expiration date of the access token |
| `user` | [`User`](modules.md#user) | Authenticated user |
| `abortSignal?` | `AbortSignal` | - |
| `completionParams?` | `Partial`<`Omit`<[`CreateChatCompletionRequest`](interfaces/openai.CreateChatCompletionRequest.md), ``"messages"`` \| ``"n"`` \| ``"stream"``\>\> | - |
| `messageId?` | `string` | - |
| `name?` | `string` | The name of a user in a multi-user chat. |
| `onProgress?` | (`partialResponse`: [`ChatMessage`](interfaces/ChatMessage.md)) => `void` | - |
| `parentMessageId?` | `string` | - |
| `stream?` | `boolean` | - |
| `systemMessage?` | `string` | - |
| `timeoutMs?` | `number` | - |
#### Defined in
[src/types.ts:8](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L8)
[src/types.ts:35](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L35)
___
### User
### UpsertMessageFunction
Ƭ **User**: `Object`
Ƭ **UpsertMessageFunction**: (`message`: [`ChatMessage`](interfaces/ChatMessage.md)) => `Promise`<`void`\>
#### Type declaration
| Name | Type | Description |
| :------ | :------ | :------ |
| `email?` | `string` | Email of the user |
| `features` | `string`[] | Features the user is in |
| `groups` | `string`[] | Groups the user is in |
| `id` | `string` | ID of the user |
| `image` | `string` | Image of the user |
| `name` | `string` | Name of the user |
| `picture` | `string` | Picture of the user |
▸ (`message`): `Promise`<`void`\>
#### Defined in
Upserts a chat message to a store.
[src/types.ts:30](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/types.ts#L30)
## Functions
### browserPostEventStream
**browserPostEventStream**(`url`, `accessToken`, `body`, `timeoutMs?`): `Promise`<[`ChatError`](modules.md#chaterror) \| [`ChatResponse`](modules.md#chatresponse)\>
This function is injected into the ChatGPT webapp page using puppeteer. It
has to be fully self-contained, so we copied a few third-party sources and
included them in here.
#### Parameters
##### Parameters
| Name | Type |
| :------ | :------ |
| `url` | `string` |
| `accessToken` | `string` |
| `body` | [`ConversationJSONBody`](modules.md#conversationjsonbody) |
| `timeoutMs?` | `number` |
| `message` | [`ChatMessage`](interfaces/ChatMessage.md) |
#### Returns
`Promise`<[`ChatError`](modules.md#chaterror) \| [`ChatResponse`](modules.md#chatresponse)\>
#### Defined in
[src/utils.ts:73](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/utils.ts#L73)
___
### defaultChromeExecutablePath
**defaultChromeExecutablePath**(): `string`
Gets the default path to chrome's executable for the current platform.
#### Returns
`string`
#### Defined in
[src/openai-auth.ts:463](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/openai-auth.ts#L463)
___
### getBrowser
**getBrowser**(`opts?`): `Promise`<`Browser`\>
Launches a non-puppeteer instance of Chrome. Note that in my testing, I wasn't
able to use the built-in `puppeteer` version of Chromium because Cloudflare
recognizes it and blocks access.
#### Parameters
| Name | Type |
| :------ | :------ |
| `opts` | `PuppeteerLaunchOptions` & { `captchaToken?`: `string` ; `minimize?`: `boolean` ; `nopechaKey?`: `string` ; `proxyServer?`: `string` } |
#### Returns
`Promise`<`Browser`\>
#### Defined in
[src/openai-auth.ts:248](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/openai-auth.ts#L248)
___
### getOpenAIAuth
**getOpenAIAuth**(`__namedParameters`): `Promise`<[`OpenAIAuth`](modules.md#openaiauth)\>
Bypasses OpenAI's use of Cloudflare to get the cookies required to use
ChatGPT. Uses Puppeteer with a stealth plugin under the hood.
If you pass `email` and `password`, then it will log into the account and
include a `sessionToken` in the response.
If you don't pass `email` and `password`, then it will just return a valid
`clearanceToken`.
This can be useful because `clearanceToken` expires after ~2 hours, whereas
`sessionToken` generally lasts much longer. We recommend renewing your
`clearanceToken` every hour or so and creating a new instance of `ChatGPTAPI`
with your updated credentials.
#### Parameters
| Name | Type |
| :------ | :------ |
| `__namedParameters` | `Object` |
| `__namedParameters.browser?` | `Browser` |
| `__namedParameters.captchaToken?` | `string` |
| `__namedParameters.email?` | `string` |
| `__namedParameters.executablePath?` | `string` |
| `__namedParameters.isGoogleLogin?` | `boolean` |
| `__namedParameters.isMicrosoftLogin?` | `boolean` |
| `__namedParameters.minimize?` | `boolean` |
| `__namedParameters.nopechaKey?` | `string` |
| `__namedParameters.page?` | `Page` |
| `__namedParameters.password?` | `string` |
| `__namedParameters.proxyServer?` | `string` |
| `__namedParameters.timeoutMs?` | `number` |
#### Returns
`Promise`<[`OpenAIAuth`](modules.md#openaiauth)\>
#### Defined in
[src/openai-auth.ts:50](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/openai-auth.ts#L50)
___
### initializeNopechaExtension
**initializeNopechaExtension**(`browser`, `opts`): `Promise`<`void`\>
#### Parameters
| Name | Type |
| :------ | :------ |
| `browser` | `Browser` |
| `opts` | `Object` |
| `opts.minimize?` | `boolean` |
| `opts.nopechaKey?` | `string` |
#### Returns
##### Returns
`Promise`<`void`\>
#### Defined in
[src/openai-auth.ts:373](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/openai-auth.ts#L373)
___
### isRelevantRequest
**isRelevantRequest**(`url`): `boolean`
#### Parameters
| Name | Type |
| :------ | :------ |
| `url` | `string` |
#### Returns
`boolean`
#### Defined in
[src/utils.ts:39](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/utils.ts#L39)
___
### markdownToText
**markdownToText**(`markdown?`): `string`
#### Parameters
| Name | Type |
| :------ | :------ |
| `markdown?` | `string` |
#### Returns
`string`
#### Defined in
[src/utils.ts:12](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/utils.ts#L12)
___
### maximizePage
**maximizePage**(`page`): `Promise`<`void`\>
#### Parameters
| Name | Type |
| :------ | :------ |
| `page` | `Page` |
#### Returns
`Promise`<`void`\>
#### Defined in
[src/utils.ts:29](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/utils.ts#L29)
___
### minimizePage
**minimizePage**(`page`): `Promise`<`void`\>
#### Parameters
| Name | Type |
| :------ | :------ |
| `page` | `Page` |
#### Returns
`Promise`<`void`\>
#### Defined in
[src/utils.ts:19](https://github.com/transitive-bullshit/chatgpt-api/blob/d1b74a8/src/utils.ts#L19)
[src/types.ts:87](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L87)
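The `ChatGPTAPIOptions`, `GetMessageByIdFunction`, and `UpsertMessageFunction` entries above describe how the client reads and persists conversation messages; a hedged sketch wiring a custom store, where a plain `Map` stands in for whatever persistence you actually use and the other option values simply restate the documented defaults:

```ts
import { ChatGPTAPI, type ChatMessage } from 'chatgpt'

// toy in-memory store standing in for a real database or Keyv backend
const store = new Map<string, ChatMessage>()

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  apiBaseUrl: 'https://api.openai.com', // documented default
  maxModelTokens: 4096,                 // documented default
  maxResponseTokens: 1000,              // documented default
  systemMessage: 'You are a concise assistant.',
  completionParams: { temperature: 0.5 },
  getMessageById: async (id) => store.get(id), // GetMessageByIdFunction: resolves to undefined/null when not found
  upsertMessage: async (message) => {          // UpsertMessageFunction: persist each new message
    store.set(message.id, message)
  }
})
```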


@ -0,0 +1,102 @@
[chatgpt](../readme.md) / [Exports](../modules.md) / openai
# Namespace: openai
## Table of contents
### Interfaces
- [ChatCompletionRequestMessage](../interfaces/openai.ChatCompletionRequestMessage.md)
- [ChatCompletionResponseMessage](../interfaces/openai.ChatCompletionResponseMessage.md)
- [CreateChatCompletionDeltaResponse](../interfaces/openai.CreateChatCompletionDeltaResponse.md)
- [CreateChatCompletionRequest](../interfaces/openai.CreateChatCompletionRequest.md)
- [CreateChatCompletionResponse](../interfaces/openai.CreateChatCompletionResponse.md)
- [CreateChatCompletionResponseChoicesInner](../interfaces/openai.CreateChatCompletionResponseChoicesInner.md)
- [CreateCompletionResponseUsage](../interfaces/openai.CreateCompletionResponseUsage.md)
### Type Aliases
- [ChatCompletionRequestMessageRoleEnum](openai.md#chatcompletionrequestmessageroleenum)
- [ChatCompletionResponseMessageRoleEnum](openai.md#chatcompletionresponsemessageroleenum)
- [CreateChatCompletionRequestStop](openai.md#createchatcompletionrequeststop)
### Variables
- [ChatCompletionRequestMessageRoleEnum](openai.md#chatcompletionrequestmessageroleenum-1)
- [ChatCompletionResponseMessageRoleEnum](openai.md#chatcompletionresponsemessageroleenum-1)
## Type Aliases
### ChatCompletionRequestMessageRoleEnum
Ƭ **ChatCompletionRequestMessageRoleEnum**: typeof [`ChatCompletionRequestMessageRoleEnum`](openai.md#chatcompletionrequestmessageroleenum-1)[keyof typeof [`ChatCompletionRequestMessageRoleEnum`](openai.md#chatcompletionrequestmessageroleenum-1)]
#### Defined in
[src/types.ts:219](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L219)
[src/types.ts:224](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L224)
___
### ChatCompletionResponseMessageRoleEnum
Ƭ **ChatCompletionResponseMessageRoleEnum**: typeof [`ChatCompletionResponseMessageRoleEnum`](openai.md#chatcompletionresponsemessageroleenum-1)[keyof typeof [`ChatCompletionResponseMessageRoleEnum`](openai.md#chatcompletionresponsemessageroleenum-1)]
#### Defined in
[src/types.ts:245](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L245)
[src/types.ts:250](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L250)
___
### CreateChatCompletionRequestStop
Ƭ **CreateChatCompletionRequestStop**: `string`[] \| `string`
**`Export`**
#### Defined in
[src/types.ts:336](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L336)
## Variables
### ChatCompletionRequestMessageRoleEnum
`Const` **ChatCompletionRequestMessageRoleEnum**: `Object`
#### Type declaration
| Name | Type |
| :------ | :------ |
| `Assistant` | ``"assistant"`` |
| `System` | ``"system"`` |
| `User` | ``"user"`` |
#### Defined in
[src/types.ts:219](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L219)
[src/types.ts:224](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L224)
___
### ChatCompletionResponseMessageRoleEnum
`Const` **ChatCompletionResponseMessageRoleEnum**: `Object`
#### Type declaration
| Name | Type |
| :------ | :------ |
| `Assistant` | ``"assistant"`` |
| `System` | ``"system"`` |
| `User` | ``"user"`` |
#### Defined in
[src/types.ts:245](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L245)
[src/types.ts:250](https://github.com/transitive-bullshit/chatgpt-api/blob/9eac18f/src/types.ts#L250)


@ -1,56 +1,22 @@
chatgpt / [Exports](modules.md)
# Update December 18, 2022 <!-- omit in toc -->
On December 11th, OpenAI added Cloudflare protections that make it more difficult to access the unofficial API.
To circumvent these protections, we've added a **fully automated browser-based solution**, which uses Puppeteer and CAPTCHA solvers under the hood. 🔥
```ts
import { ChatGPTAPIBrowser } from 'chatgpt'
const api = new ChatGPTAPIBrowser({
email: process.env.OPENAI_EMAIL,
password: process.env.OPENAI_PASSWORD
})
await api.initSession()
const result = await api.sendMessage('Hello World!')
console.log(result.response)
```
This solution is not lightweight, but it does work a lot more consistently than the previous REST API-based approach. For example, I'm currently using this approach to automate 10 concurrent OpenAI accounts for my [Twitter bot](https://github.com/transitive-bullshit/chatgpt-twitter-bot). 😂
To use the updated version, **make sure you're using the latest version of this package and Node.js >= 18**. Then update your code following the examples below, paying special attention to the sections on [Authentication](#authentication), [Restrictions](#restrictions), and [CAPTCHAs](#captchas).
We recently added support for CAPTCHA automation using either [nopecha](https://nopecha.com/) or [2captcha](https://2captcha.com). Keep in mind that this package will be updated to use the official API as soon as it's released, so things should get much easier over time. 💪
Lastly, please consider starring this repo and <a href="https://twitter.com/transitive_bs">following me on twitter <img src="https://storage.googleapis.com/saasify-assets/twitter-logo.svg" alt="twitter" height="24px" align="center"></a> to help support the project.
Thanks && cheers,
[Travis](https://twitter.com/transitive_bs)
---
<p align="center">
<img alt="Example usage" src="/media/demo.gif">
</p>
# ChatGPT API <!-- omit in toc -->
> Node.js client for the unofficial [ChatGPT](https://openai.com/blog/chatgpt/) API.
> Node.js client for the official [ChatGPT](https://openai.com/blog/chatgpt/) API.
[![NPM](https://img.shields.io/npm/v/chatgpt.svg)](https://www.npmjs.com/package/chatgpt) [![Build Status](https://github.com/transitive-bullshit/chatgpt-api/actions/workflows/test.yml/badge.svg)](https://github.com/transitive-bullshit/chatgpt-api/actions/workflows/test.yml) [![MIT License](https://img.shields.io/badge/license-MIT-blue)](https://github.com/transitive-bullshit/chatgpt-api/blob/main/license) [![Prettier Code Formatting](https://img.shields.io/badge/code_style-prettier-brightgreen.svg)](https://prettier.io)
- [Intro](#intro)
- [Updates](#updates)
- [CLI](#cli)
- [Install](#install)
- [Usage](#usage)
- [Docs](#docs)
- [Demos](#demos)
- [Authentication](#authentication)
- [CAPTCHAs](#captchas)
- [Using Proxies](#using-proxies)
- [Restrictions](#restrictions)
- [Usage - ChatGPTAPI](#usage---chatgptapi)
- [Usage - ChatGPTUnofficialProxyAPI](#usage---chatgptunofficialproxyapi)
- [Reverse Proxy](#reverse-proxy)
- [Access Token](#access-token)
- [Docs](#docs)
- [Demos](#demos)
- [Projects](#projects)
- [Compatibility](#compatibility)
- [Credits](#credits)
@ -60,138 +26,351 @@ Thanks && cheers,
This package is a Node.js wrapper around [ChatGPT](https://openai.com/blog/chatgpt) by [OpenAI](https://openai.com). TS batteries included. ✨
You can use it to start building projects powered by ChatGPT like chatbots, websites, etc...
<p align="center">
<img alt="Example usage" src="/media/demo.gif">
</p>
## Updates
<details open>
<summary><strong>March 1, 2023</strong></summary>
<br/>
The [official OpenAI chat completions API](https://platform.openai.com/docs/guides/chat) has been released, and it is now the default for this package! 🔥
| Method | Free? | Robust? | Quality? |
| --------------------------- | ------ | -------- | ----------------------- |
| `ChatGPTAPI` | ❌ No | ✅ Yes | ✅️ Real ChatGPT models |
| `ChatGPTUnofficialProxyAPI` | ✅ Yes | ☑️ Maybe | ✅ Real ChatGPT |
**Note**: We strongly recommend using `ChatGPTAPI` since it uses the officially supported API from OpenAI. We may remove support for `ChatGPTUnofficialProxyAPI` in a future release.
1. `ChatGPTAPI` - Uses the `gpt-3.5-turbo-0301` model with the official OpenAI chat completions API (official, robust approach, but it's not free)
2. `ChatGPTUnofficialProxyAPI` - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)
</details>
<details>
<summary><strong>Previous Updates</strong></summary>
<br/>
<details>
<summary><strong>Feb 19, 2023</strong></summary>
<br/>
We now provide three ways of accessing the unofficial ChatGPT API, all of which have tradeoffs:
| Method | Free? | Robust? | Quality? |
| --------------------------- | ------ | -------- | ----------------- |
| `ChatGPTAPI` | ❌ No | ✅ Yes | ☑️ Mimics ChatGPT |
| `ChatGPTUnofficialProxyAPI` | ✅ Yes | ☑️ Maybe | ✅ Real ChatGPT |
| `ChatGPTAPIBrowser` (v3) | ✅ Yes | ❌ No | ✅ Real ChatGPT |
**Note**: I recommend that you use either `ChatGPTAPI` or `ChatGPTUnofficialProxyAPI`.
1. `ChatGPTAPI` - (Used to use) `text-davinci-003` to mimic ChatGPT via the official OpenAI completions API (most robust approach, but it's not free and doesn't use a model fine-tuned for chat)
2. `ChatGPTUnofficialProxyAPI` - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)
3. `ChatGPTAPIBrowser` - (_deprecated_; v3.5.1 of this package) Uses Puppeteer to access the official ChatGPT webapp (uses the real ChatGPT, but very flaky, heavyweight, and error prone)
</details>
<details>
<summary><strong>Feb 5, 2023</strong></summary>
<br/>
OpenAI has disabled the leaked chat model we were previously using, so we're now defaulting to `text-davinci-003`, which is not free.
We've found several other hidden, fine-tuned chat models, but OpenAI keeps disabling them, so we're searching for alternative workarounds.
</details>
<details>
<summary><strong>Feb 1, 2023</strong></summary>
<br/>
This package no longer requires any browser hacks – **it is now using the official OpenAI completions API** with a leaked model that ChatGPT uses under the hood. 🔥
```ts
import { ChatGPTAPI } from 'chatgpt'
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY
})
const res = await api.sendMessage('Hello World!')
console.log(res.text)
```
Please upgrade to `chatgpt@latest` (at least [v4.0.0](https://github.com/transitive-bullshit/chatgpt-api/releases/tag/v4.0.0)). The updated version is **significantly more lightweight and robust** compared with previous versions. You also don't have to worry about IP issues or rate limiting.
Huge shoutout to [@waylaidwanderer](https://github.com/waylaidwanderer) for discovering the leaked chat model!
</details>
</details>
If you run into any issues, we do have a pretty active [Discord](https://discord.gg/v9gERj825w) with a bunch of ChatGPT hackers from the Node.js & Python communities.
Lastly, please consider starring this repo and <a href="https://twitter.com/transitive_bs">following me on twitter <img src="https://storage.googleapis.com/saasify-assets/twitter-logo.svg" alt="twitter" height="24px" align="center"></a> to help support the project.
Thanks && cheers,
[Travis](https://twitter.com/transitive_bs)
## CLI
To run the CLI, you'll need an [OpenAI API key](https://platform.openai.com/overview):
```bash
export OPENAI_API_KEY="sk-TODO"
npx chatgpt "your prompt here"
```
By default, the response is streamed to stdout, the results are stored in a local config file, and every invocation starts a new conversation. You can use `-c` to continue the previous conversation and `--no-stream` to disable streaming.
```
Usage:
$ chatgpt <prompt>
Commands:
<prompt> Ask ChatGPT a question
rm-cache Clears the local message cache
ls-cache Prints the local message cache path
For more info, run any command with the `--help` flag:
$ chatgpt --help
$ chatgpt rm-cache --help
$ chatgpt ls-cache --help
Options:
-c, --continue Continue last conversation (default: false)
-d, --debug Enables debug logging (default: false)
-s, --stream Streams the response (default: true)
-s, --store Enables the local message cache (default: true)
-t, --timeout Timeout in milliseconds
-k, --apiKey OpenAI API key
-n, --conversationName Unique name for the conversation
-h, --help Display this message
-v, --version Display version number
```
## Install
```bash
npm install chatgpt puppeteer
npm install chatgpt
```
`puppeteer` is an optional peer dependency used to automate bypassing the Cloudflare protections via `getOpenAIAuth`. The main API wrapper uses `fetch` directly.
Make sure you're using `node >= 18` so `fetch` is available (or `node >= 14` if you install a [fetch polyfill](https://github.com/developit/unfetch#usage-as-a-polyfill)).
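If you're stuck on an older Node.js version, here's a minimal sketch of wiring up a polyfill before importing the package (assuming the `unfetch` polyfill; `isomorphic-fetch` works the same way):

```ts
// only needed on node 14–17, where there is no global fetch
import 'unfetch/polyfill'

import { ChatGPTAPI } from 'chatgpt'

const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })
const res = await api.sendMessage('Hello World!')
console.log(res.text)
```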
## Usage
To use this module from Node.js, you need to pick between two methods:
| Method | Free? | Robust? | Quality? |
| --------------------------- | ------ | -------- | ----------------------- |
| `ChatGPTAPI` | ❌ No | ✅ Yes | ✅️ Real ChatGPT models |
| `ChatGPTUnofficialProxyAPI` | ✅ Yes | ☑️ Maybe | ✅ Real ChatGPT |
1. `ChatGPTAPI` - Uses the `gpt-3.5-turbo-0301` model with the official OpenAI chat completions API (official, robust approach, but it's not free). You can override the model, completion params, and system message to fully customize your assistant.
2. `ChatGPTUnofficialProxyAPI` - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)
Both approaches have very similar APIs, so it should be simple to swap between them.
**Note**: We strongly recommend using `ChatGPTAPI` since it uses the officially supported API from OpenAI. We may remove support for `ChatGPTUnofficialProxyAPI` in a future release.
### Usage - ChatGPTAPI
Sign up for an [OpenAI API key](https://platform.openai.com/overview) and store it in your environment.
```ts
import { ChatGPTAPIBrowser } from 'chatgpt'
import { ChatGPTAPI } from 'chatgpt'
async function example() {
// use puppeteer to bypass cloudflare (headful because of captchas)
const api = new ChatGPTAPIBrowser({
email: process.env.OPENAI_EMAIL,
password: process.env.OPENAI_PASSWORD
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY
})
await api.initSession()
const result = await api.sendMessage('Hello World!')
console.log(result.response)
const res = await api.sendMessage('Hello World!')
console.log(res.text)
}
```
<details>
<summary>Or, if you want to use the REST-based version:</summary>
You can override the default `model` (`gpt-3.5-turbo-0301`) and any [OpenAI chat completion params](https://platform.openai.com/docs/api-reference/chat/create) using `completionParams`:
```ts
import { ChatGPTAPI, getOpenAIAuth } from 'chatgpt'
async function example() {
// use puppeteer to bypass cloudflare (headful because of captchas)
const openAIAuth = await getOpenAIAuth({
email: process.env.OPENAI_EMAIL,
password: process.env.OPENAI_PASSWORD
})
const api = new ChatGPTAPI({ ...openAIAuth })
await api.initSession()
// send a message and wait for the response
const result = await api.sendMessage('Write a python version of bubble sort.')
// result.response is a markdown-formatted string
console.log(result.response)
}
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY,
completionParams: {
temperature: 0.5,
top_p: 0.8
}
})
```
</details>
ChatGPT responses are formatted as markdown by default. If you want to work with plaintext instead, you can use:
If you want to track the conversation, you'll need to pass the `parentMessageId` like this:
```ts
const api = new ChatGPTAPIBrowser({ email, password, markdown: false })
```
If you want to track the conversation, use the `conversationId` and `messageId` in the result object, and pass them to `sendMessage` as `conversationId` and `parentMessageId` respectively.
```ts
const api = new ChatGPTAPIBrowser({ email, password })
await api.initSession()
const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })
// send a message and wait for the response
let res = await api.sendMessage('What is OpenAI?')
console.log(res.response)
console.log(res.text)
// send a follow-up
res = await api.sendMessage('Can you expand on that?', {
conversationId: res.conversationId,
parentMessageId: res.messageId
parentMessageId: res.id
})
console.log(res.response)
console.log(res.text)
// send another follow-up
// send a follow-up
res = await api.sendMessage('What were we talking about?', {
conversationId: res.conversationId,
parentMessageId: res.messageId
parentMessageId: res.id
})
console.log(res.response)
console.log(res.text)
```
Sometimes, ChatGPT will hang for an extended period of time before beginning to respond. This may be due to rate limiting or it may be due to OpenAI's servers being overloaded.
You can add streaming via the `onProgress` handler:
To mitigate these issues, you can add a timeout like this:
```ts
const res = await api.sendMessage('Write a 500 word essay on frogs.', {
// print the partial response as the AI is "typing"
onProgress: (partialResponse) => console.log(partialResponse.text)
})
// print the full text at the end
console.log(res.text)
```
You can add a timeout using the `timeoutMs` option:
```ts
// timeout after 2 minutes (which will also abort the underlying HTTP request)
const response = await api.sendMessage('this is a timeout test', {
timeoutMs: 2 * 60 * 1000
const response = await api.sendMessage(
'write me a really really long essay on frogs',
{
timeoutMs: 2 * 60 * 1000
}
)
```
If you want to see more info about what's actually being sent to [OpenAI's chat completions API](https://platform.openai.com/docs/api-reference/chat/create), set the `debug: true` option in the `ChatGPTAPI` constructor:
```ts
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY,
debug: true
})
```
We default to a basic `systemMessage`. You can override this in either the `ChatGPTAPI` constructor or `sendMessage`:
```ts
const res = await api.sendMessage('what is the answer to the universe?', {
systemMessage: `You are ChatGPT, a large language model trained by OpenAI. You answer as concisely as possible for each response. If you are generating a list, do not have too many items.
Current date: ${new Date().toISOString()}\n\n`
})
```
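Equivalently, you can set a default `systemMessage` once in the constructor so every `sendMessage` call inherits it (a minimal sketch; the message text is just an example):

```ts
const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  // default system message applied to every conversation
  systemMessage: 'You are ChatGPT. Answer as concisely as possible.'
})
```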
Note that we automatically handle appending the previous messages to the prompt and attempt to optimize for the available tokens (which defaults to `4096`).
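If you need a different budget, for example when pointing at a model with a larger context window, the token limits can be tuned; a sketch assuming the `maxModelTokens` / `maxResponseTokens` options:

```ts
const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  // total tokens the model can handle per request (prompt + completion)
  maxModelTokens: 8100,
  // tokens reserved for the model's reply
  maxResponseTokens: 1000
})
```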
<details>
<summary>Usage in CommonJS (Dynamic import)</summary>
```js
async function example() {
// To use ESM in CommonJS, you can use a dynamic import
const { ChatGPTAPI, getOpenAIAuth } = await import('chatgpt')
const { ChatGPTAPI } = await import('chatgpt')
const openAIAuth = await getOpenAIAuth({
email: process.env.OPENAI_EMAIL,
password: process.env.OPENAI_PASSWORD
})
const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })
const api = new ChatGPTAPI({ ...openAIAuth })
await api.initSession()
const result = await api.sendMessage('Hello World!')
console.log(result)
const res = await api.sendMessage('Hello World!')
console.log(res.text)
}
```
</details>
### Docs
### Usage - ChatGPTUnofficialProxyAPI
See the [auto-generated docs](./docs/classes/ChatGPTAPI.md) for more info on methods and parameters. Here are the [docs](./docs/classes/ChatGPTAPIBrowser.md) for the browser-based version.
The API for `ChatGPTUnofficialProxyAPI` is almost exactly the same. You just need to provide a ChatGPT `accessToken` instead of an OpenAI API key.
### Demos
```ts
import { ChatGPTUnofficialProxyAPI } from 'chatgpt'
async function example() {
const api = new ChatGPTUnofficialProxyAPI({
accessToken: process.env.OPENAI_ACCESS_TOKEN
})
const res = await api.sendMessage('Hello World!')
console.log(res.text)
}
```
See [demos/demo-reverse-proxy](./demos/demo-reverse-proxy.ts) for a full example:
```bash
npx tsx demos/demo-reverse-proxy.ts
```
`ChatGPTUnofficialProxyAPI` messages also contain a `conversationId` in addition to `parentMessageId`, since the ChatGPT webapp can't reference messages across different accounts & conversations.
#### Reverse Proxy
You can override the reverse proxy by passing `apiReverseProxyUrl`:
```ts
const api = new ChatGPTUnofficialProxyAPI({
accessToken: process.env.OPENAI_ACCESS_TOKEN,
apiReverseProxyUrl: 'https://your-example-server.com/api/conversation'
})
```
Known reverse proxies run by community members include:
| Reverse Proxy URL | Author | Rate Limits | Last Checked |
| ------------------------------------------------ | -------------------------------------------- | ----------------- | ------------ |
| `https://bypass.churchless.tech/api/conversation` | [@acheong08](https://github.com/acheong08) | 5 req / 10 seconds by IP | 3/24/2023 |
| `https://api.pawan.krd/backend-api/conversation` | [@PawanOsman](https://github.com/PawanOsman) | 50 req / 15 seconds (~3 r/s) | 3/23/2023 |
Note: info on how the reverse proxies work is not being published at this time in order to prevent OpenAI from disabling access.
#### Access Token
To use `ChatGPTUnofficialProxyAPI`, you'll need an OpenAI access token from the ChatGPT webapp. To do this, you can use any of the following methods which take an `email` and `password` and return an access token:
- Node.js libs
- [ericlewis/openai-authenticator](https://github.com/ericlewis/openai-authenticator)
- [michael-dm/openai-token](https://github.com/michael-dm/openai-token)
- [allanoricil/chat-gpt-authenticator](https://github.com/AllanOricil/chat-gpt-authenticator)
- Python libs
- [acheong08/OpenAIAuth](https://github.com/acheong08/OpenAIAuth)
These libraries work with email + password accounts (i.e., they do not support accounts that authenticate via Microsoft / Google).
Alternatively, you can manually get an `accessToken` by logging in to the ChatGPT webapp and then opening `https://chat.openai.com/api/auth/session`, which will return a JSON object containing your `accessToken` string.
Access tokens last for days.
**Note**: using a reverse proxy will expose your access token to a third party. There shouldn't be any adverse effects from this, but please consider the risks before using this method.
## Docs
See the [auto-generated docs](./docs/classes/ChatGPTAPI.md) for more info on methods and parameters.
## Demos
Most of the demos use `ChatGPTAPI`. It should be pretty easy to convert them to use `ChatGPTUnofficialProxyAPI` if you'd rather use that approach. The only thing that needs to change is how you initialize the api with an `accessToken` instead of an `apiKey`.
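In other words, the change is roughly this (a sketch, assuming you already have an access token in `OPENAI_ACCESS_TOKEN`):

```ts
import { ChatGPTAPI, ChatGPTUnofficialProxyAPI } from 'chatgpt'

// official API (used by most of the demos)
const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })

// unofficial proxy: swap in an access token, everything else stays the same
const proxyApi = new ChatGPTUnofficialProxyAPI({
  accessToken: process.env.OPENAI_ACCESS_TOKEN
})
```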
To run the included demos:
1. clone repo
2. install node deps
3. set `OPENAI_EMAIL` and `OPENAI_PASSWORD` in .env
3. set `OPENAI_API_KEY` in .env
A [basic demo](./demos/demo.ts) is included for testing purposes:
@ -199,105 +378,31 @@ A [basic demo](./demos/demo.ts) is included for testing purposes:
npx tsx demos/demo.ts
```
A [conversation demo](./demos/demo-conversation.ts) is also included:
A [demo showing on progress handler](./demos/demo-on-progress.ts):
```bash
npx tsx demos/demo-on-progress.ts
```
The on progress demo uses the optional `onProgress` parameter to `sendMessage` to receive intermediary results as ChatGPT is "typing".
A [conversation demo](./demos/demo-conversation.ts):
```bash
npx tsx demos/demo-conversation.ts
```
A [browser-based conversation demo](./demos/demo-conversation-browser.ts) is also included:
A [persistence demo](./demos/demo-persistence.ts) shows how to store messages in Redis for persistence:
```bash
npx tsx demos/demo-conversation-browser.ts
npx tsx demos/demo-persistence.ts
```
### Authentication
Any [keyv adaptor](https://github.com/jaredwray/keyv) is supported for persistence, and there are overrides if you'd like to use a different way of storing / retrieving messages.
The authentication section relates to the REST-based version (using `getOpenAIAuth` + `ChatGPTAPI`). The browser-based solution, `ChatGPTAPIBrowser`, takes care of all the authentication for you.
Note that persisting messages is required for remembering the context of previous conversations beyond the scope of the current Node.js process, since by default we only store messages in memory. Here's an [external demo](https://github.com/transitive-bullshit/chatgpt-twitter-bot/blob/main/src/index.ts#L86-L95) of using a completely custom database solution to persist messages.
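For example, here's a minimal sketch of backing the message store with Redis (assuming the `messageStore` option accepts a Keyv instance, as in the persistence demo):

```ts
import { ChatGPTAPI } from 'chatgpt'
import Keyv from 'keyv'
import KeyvRedis from '@keyv/redis'

// connect Keyv to a local Redis instance
const store = new KeyvRedis('redis://localhost:6379')
const messageStore = new Keyv({ store, namespace: 'chatgpt-demo' })

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  messageStore
})
```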
On December 11, 2022, OpenAI added some additional Cloudflare protections which make it more difficult to access the unofficial API.
You'll need a valid OpenAI "session token" and Cloudflare "clearance token" in order to use the API.
We've provided an automated, Puppeteer-based solution `getOpenAIAuth` to fetch these for you, but you may still run into cases where you have to manually pass the CAPTCHA. We're working on a solution to automate this further.
You can also get these tokens manually, but keep in mind that the `clearanceToken` only lasts for max 2 hours.
<details>
<summary>Getting tokens manually</summary>
To get session token manually:
1. Go to https://chat.openai.com/chat and log in or sign up.
2. Open dev tools.
3. Open `Application` > `Cookies`.
![ChatGPT cookies](./media/session-token.png)
4. Copy the value for `__Secure-next-auth.session-token` and save it to your environment. This will be your `sessionToken`.
5. Copy the value for `cf_clearance` and save it to your environment. This will be your `clearanceToken`.
6. Copy the value of the `user-agent` header from any request in your `Network` tab. This will be your `userAgent`.
Pass `sessionToken`, `clearanceToken`, and `userAgent` to the `ChatGPTAPI` constructor.
</details>
> **Note**
> This package will switch to using the official API once it's released, which will make this process much simpler.
### CAPTCHAs
The browser portions of this package use Puppeteer to automate as much as possible, including solving all CAPTCHAs. 🔥
Basic Cloudflare CAPTCHAs are handled by default, but if you want to automate the email + password Recaptchas, you'll need to sign up for one of these paid providers:
- [nopecha](https://nopecha.com/) - Uses AI to solve CAPTCHAS
- Faster and cheaper
- Set the `NOPECHA_KEY` env var to your nopecha API key
- [Demo video](https://user-images.githubusercontent.com/552829/208235991-de4890f2-e7ba-4b42-bf55-4fcd792d4b19.mp4) of nopecha solving the login Recaptcha (41 seconds)
- [2captcha](https://2captcha.com) - Uses real people to solve CAPTCHAS
- More well-known solution that's been around longer
- Set the `CAPTCHA_TOKEN` env var to your 2captcha API token
Alternatively, if your OpenAI account uses Google Auth, you shouldn't encounter any of the more complicated Recaptchas — and can avoid using these third-party providers. To use Google auth, make sure your OpenAI account is using Google and then set `isGoogleLogin` to `true` whenever you're passing your `email` and `password`. For example:
```ts
const api = new ChatGPTAPIBrowser({
email: process.env.OPENAI_EMAIL,
password: process.env.OPENAI_PASSWORD,
isGoogleLogin: true
})
```
### Using Proxies
The browser implementation supports setting a proxy server. This is useful if you're running into rate limiting issues or if you want to use a proxy to hide your IP address.
To use a proxy, pass the `proxyServer` option to the `ChatGPTAPIBrowser` constructor, or simply set the `PROXY_SERVER` env var. For more information on the format, see [here](https://www.chromium.org/developers/design-documents/network-settings).
```ts
const api = new ChatGPTAPIBrowser({
email: process.env.OPENAI_EMAIL,
password: process.env.OPENAI_PASSWORD,
proxyServer: '<ip>:<port>'
})
```
You can also set the `PROXY_VALIDATE_IP` env var to your proxy's IP address. This will be used to validate that the proxy is working correctly, and will throw an error if it's not.
### Restrictions
These restrictions are for the `getOpenAIAuth` + `ChatGPTAPI` solution, which uses the unofficial API. The browser-based solution, `ChatGPTAPIBrowser`, generally doesn't have any of these restrictions.
**Please read carefully**
- You must use `node >= 18` at the moment. I'm using `v19.2.0` in my testing.
- Cloudflare `cf_clearance` **tokens expire after 2 hours**, so right now we recommend that you refresh your `cf_clearance` token every hour or so.
- Your `user-agent` and `IP address` **must match** from the real browser window you're logged in with to the one you're using for `ChatGPTAPI`.
- This means that you currently can't log in with your laptop and then run the bot on a server or proxy somewhere.
- Cloudflare will still sometimes ask you to complete a CAPTCHA, so you may need to keep an eye on it and manually resolve the CAPTCHA.
- You should not be using this account while the bot is using it, because that browser window may refresh one of your tokens and invalidate the bot's session.
> **Note**
> Prior to v1.0.0, this package used a headless browser via [Playwright](https://playwright.dev/) to automate the web UI. Here are the [docs for the initial browser version](https://github.com/transitive-bullshit/chatgpt-api/tree/v0.4.2).
**Note**: Persistence is handled automatically when using `ChatGPTUnofficialProxyAPI` because it is connecting indirectly to ChatGPT.
## Projects
@ -305,6 +410,8 @@ All of these awesome projects are built using the `chatgpt` package. 🤯
- [Twitter Bot](https://github.com/transitive-bullshit/chatgpt-twitter-bot) powered by ChatGPT ✨
- Mention [@ChatGPTBot](https://twitter.com/ChatGPTBot) on Twitter with your prompt to try it out
- [ChatGPT API Server](https://github.com/waylaidwanderer/node-chatgpt-api) - API server for this package with support for multiple OpenAI accounts, proxies, and load-balancing requests between accounts.
- [ChatGPT Prompts](https://github.com/pacholoamit/chatgpt-prompts) - A collection of 140+ of the best ChatGPT prompts from the community.
- [Lovelines.xyz](https://lovelines.xyz?ref=chatgpt-api)
- [Chrome Extension](https://github.com/gragland/chatgpt-everywhere) ([demo](https://twitter.com/gabe_ragland/status/1599466486422470656))
- [VSCode Extension #1](https://github.com/mpociot/chatgpt-vscode) ([demo](https://twitter.com/marcelpociot/status/1599180144551526400), [updated version](https://github.com/timkmecl/chatgpt-vscode), [marketplace](https://marketplace.visualstudio.com/items?itemName=timkmecl.chatgpt))
@ -315,18 +422,27 @@ All of these awesome projects are built using the `chatgpt` package. 🤯
- [Raycast Extension #2](https://github.com/domnantas/raycast-chatgpt)
- [Telegram Bot #1](https://github.com/realies/chatgpt-telegram-bot)
- [Telegram Bot #2](https://github.com/dawangraoming/chatgpt-telegram-bot)
- [Telegram Bot #3](https://github.com/RainEggplant/chatgpt-telegram-bot) (group privacy mode, ID-based auth)
- [Telegram Bot #4](https://github.com/ArdaGnsrn/chatgpt-telegram) (queue system, ID-based chat thread)
- [Telegram Bot #5](https://github.com/azoway/chatgpt-telegram-bot) (group privacy mode, ID-based chat thread)
- [Deno Telegram Bot](https://github.com/Ciyou/chatbot-telegram)
- [Go Telegram Bot](https://github.com/m1guelpf/chatgpt-telegram)
- [Telegram Bot for YouTube Summaries](https://github.com/codextde/youtube-summary)
- [GitHub ProBot](https://github.com/oceanlvr/ChatGPTBot)
- [Discord Bot #1](https://github.com/onury5506/Discord-ChatGPT-Bot)
- [Discord Bot #2](https://github.com/Nageld/ChatGPT-Bot)
- [Discord Bot #3](https://github.com/leinstay/gptbot)
- [Discord Bot #4 (selfbot)](https://github.com/0x7030676e31/cumsocket)
- [Discord Bot #5](https://github.com/itskdhere/ChatGPT-Discord-BOT)
- [Discord Bot #6 (Shakespeare bot)](https://gist.github.com/TheBrokenRail/4b37e7c44e8f721d8bd845050d034c16)
- [Discord Bot #7](https://github.com/Elitezen/discordjs-chatgpt)
- [Zoom Chat](https://github.com/shixin-guo/my-bot)
- [WeChat Bot #1](https://github.com/AutumnWhj/ChatGPT-wechat-bot)
- [WeChat Bot #2](https://github.com/fuergaosi233/wechat-chatgpt)
- [WeChat Bot #3](https://github.com/wangrongding/wechat-bot)
- [WeChat Bot #3](https://github.com/wangrongding/wechat-bot)
- [WeChat Bot #4](https://github.com/darknightlab/wechat-bot)
- [WeChat Bot #5](https://github.com/sunshanpeng/wechaty-chatgpt)
- [WeChat Bot #6](https://github.com/formulahendry/chatgpt-wechat-bot)
- [QQ Bot (plugin for Yunzai-bot)](https://github.com/ikechan8370/chatgpt-plugin)
- [QQ Bot (plugin for KiviBot)](https://github.com/KiviBotLab/kivibot-plugin-chatgpt)
- [QQ Bot (oicq)](https://github.com/easydu2002/chat_gpt_oicq)
@ -336,32 +452,54 @@ All of these awesome projects are built using the `chatgpt` package. 🤯
- [Flutter ChatGPT API](https://github.com/coskuncay/flutter_chatgpt_api)
- [Carik Bot](https://github.com/luridarmawan/Carik)
- [Github Action for reviewing PRs](https://github.com/kxxt/chatgpt-action/)
- [WhatsApp Bot #1](https://github.com/pascalroget/whatsgpt) (multi-user support)
- [WhatsApp Bot #1](https://github.com/askrella/whatsapp-chatgpt) (DALL-E + Whisper support 💪)
- [WhatsApp Bot #2](https://github.com/amosayomide05/chatgpt-whatsapp-bot)
- [WhatsApp Bot #3](https://github.com/navopw/whatsapp-chatgpt)
- [Matrix Bot](https://github.com/jakecoppinger/matrix-chatgpt-bot)
- [WhatsApp Bot #3](https://github.com/pascalroget/whatsgpt) (multi-user support)
- [WhatsApp Bot #4](https://github.com/noelzappy/chatgpt-whatsapp) (schedule periodic messages)
- [WhatsApp Bot #5](https://github.com/hujanais/bs-chat-gpt3-api) (RaspberryPi + ngrok + Twilio)
- [WhatsApp Bot #6](https://github.com/dannysantino/whatsgpt) (Session and chat history storage with MongoStore)
- [Matrix Bot](https://github.com/matrixgpt/matrix-chatgpt-bot)
- [Rental Cover Letter Generator](https://sharehouse.app/ai)
- [Assistant CLI](https://github.com/diciaup/assistant-cli)
- [Teams Bot](https://github.com/formulahendry/chatgpt-teams-bot)
- [Askai](https://github.com/yudax42/askai)
- [TalkGPT](https://github.com/ShadovvBeast/TalkGPT)
- [ChatGPT With Voice](https://github.com/thanhsonng/chatgpt-voice)
- [iOS Shortcut](https://github.com/leecobaby/shortcuts/blob/master/other/ChatGPT_EN.md)
- [Slack Bot](https://github.com/trietphm/chatgpt-slackbot/)
- [Slack Bot #1](https://github.com/trietphm/chatgpt-slackbot/)
- [Slack Bot #2](https://github.com/lokwkin/chatgpt-slackbot-node/) (with queueing mechanism)
- [Slack Bot #3](https://github.com/NessunKim/slack-chatgpt/)
- [Slack Bot #4](https://github.com/MarkusGalant/chatgpt-slackbot-serverless/) (Serverless AWS Lambda)
- [Electron Bot](https://github.com/ShiranAbir/chaty)
- [Kodyfire CLI](https://github.com/nooqta/chatgpt-kodyfire)
- [Twitch Bot](https://github.com/BennyDeeDev/chatgpt-twitch-bot)
- [Continuous Conversation](https://github.com/DanielTerletzkiy/chat-gtp-assistant)
- [Figma plugin](https://github.com/frederickk/chatgpt-figma-plugin)
- [NestJS server](https://github.com/RusDyn/chatgpt_nestjs_server)
- [NestJS ChatGPT Starter Boilerplate](https://github.com/mitkodkn/nestjs-chatgpt-starter)
- [Wordsmith: Add-in for Microsoft Word](https://github.com/xtremehpx/Wordsmith)
- [QuizGPT: Create Kahoot quizzes with ChatGPT](https://github.com/Kladdy/quizgpt)
- [openai-chatgpt: Talk to ChatGPT from the terminal](https://github.com/gmpetrov/openai-chatgpt)
- [Clippy the Salesforce chatbot](https://github.com/sebas00/chatgptclippy) ClippyJS joke bot
- [ai-assistant](https://github.com/youking-lib/ai-assistant) Chat assistant
- [Feishu Bot](https://github.com/linjungz/feishu-chatgpt-bot)
- [DomainGPT: Discover available domain names](https://github.com/billylo1/DomainGPT)
- [AI Poem Generator](https://aipoemgenerator.com/)
- [Next.js ChatGPT With Firebase](https://github.com/youngle316/chatgpt)
- [ai-commit – GPT-3 Commit Message Generator](https://github.com/insulineru/ai-commit)
- [AItinerary – ChatGPT itinerary Generator](https://aitinerary.ai)
- [wechaty-chatgpt - A chatbot based on Wechaty & ChatGPT](https://github.com/zhengxs2018/wechaty-chatgpt)
- [Julius GPT](https://github.com/christophebe/julius-gpt) - Generate and publish your content from the command line with the help of AI (GPT).
If you create a cool integration, feel free to open a PR and add it to the list.
## Compatibility
This package is ESM-only. It supports:
- Node.js >= 18
- Node.js 17, 16, and 14 were supported in earlier versions, but OpenAI's Cloudflare update caused a bug with `undici` on v17 and v16 that needs investigation. So for now, use `node >= 18`
- We recommend against using `chatgpt` from client-side browser code because it would expose your private session token
- This package is ESM-only.
- This package supports `node >= 14`.
- This module assumes that `fetch` is installed.
- In `node >= 18`, it's installed by default.
- In `node < 18`, you need to install a polyfill like `unfetch/polyfill` ([guide](https://github.com/developit/unfetch#usage-as-a-polyfill)) or `isomorphic-fetch` ([guide](https://github.com/matthew-andrews/isomorphic-fetch#readme)).
- If you want to build a website using `chatgpt`, we recommend using it only from your backend API
## Credits


@ -1,6 +1,6 @@
MIT License
Copyright (c) 2022 Travis Fischer
Copyright (c) 2023 Travis Fischer
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

10346
package-lock.json generated 100644

Diff is too large to display.


@ -1,7 +1,7 @@
{
"name": "chatgpt",
"version": "3.3.3",
"description": "Node.js client for the unofficial ChatGPT API.",
"version": "5.1.4",
"description": "Node.js client for the official ChatGPT API.",
"author": "Travis Fischer <travis@transitivebullsh.it>",
"repository": "transitive-bullshit/chatgpt-api",
"license": "MIT",
@ -17,10 +17,11 @@
},
"files": [
"build",
"third-party"
"bin"
],
"bin": "./bin/cli.js",
"engines": {
"node": ">=18"
"node": ">=14"
},
"scripts": {
"build": "tsup",
@ -36,62 +37,48 @@
"test:prettier": "prettier '**/*.{js,jsx,ts,tsx}' --check"
},
"dependencies": {
"delay": "^5.0.0",
"@dqbd/tiktoken": "^1.0.2",
"cac": "^6.7.14",
"conf": "^11.0.1",
"eventsource-parser": "^0.0.5",
"expiry-map": "^2.0.0",
"html-to-md": "^0.8.3",
"p-timeout": "^6.0.0",
"puppeteer-extra": "^3.3.4",
"puppeteer-extra-plugin-recaptcha": "npm:@fisch0920/puppeteer-extra-plugin-recaptcha@^3.6.6",
"puppeteer-extra-plugin-stealth": "^2.11.1",
"random": "^4.1.0",
"remark": "^14.0.2",
"strip-markdown": "^5.0.0",
"keyv": "^4.5.2",
"p-timeout": "^6.1.1",
"quick-lru": "^6.1.1",
"read-pkg-up": "^9.1.0",
"uuid": "^9.0.0"
},
"devDependencies": {
"@trivago/prettier-plugin-sort-imports": "^4.0.0",
"@types/node": "^18.11.9",
"@types/uuid": "^9.0.0",
"ava": "^5.1.0",
"@keyv/redis": "^2.5.6",
"@trivago/prettier-plugin-sort-imports": "^4.1.1",
"@types/node": "^18.15.10",
"@types/uuid": "^9.0.1",
"del-cli": "^5.0.0",
"dotenv-safe": "^8.2.0",
"husky": "^8.0.2",
"lint-staged": "^13.0.3",
"husky": "^8.0.3",
"lint-staged": "^13.2.0",
"npm-run-all": "^4.1.5",
"ora": "^6.1.2",
"prettier": "^2.8.0",
"puppeteer": "^19.4.0",
"tsup": "^6.5.0",
"tsx": "^3.12.1",
"typedoc": "^0.23.21",
"typedoc-plugin-markdown": "^3.13.6",
"typescript": "^4.9.3"
},
"peerDependencies": {
"puppeteer": "*"
"ora": "^6.3.0",
"prettier": "^2.8.7",
"tsup": "^6.7.0",
"tsx": "^3.12.6",
"typedoc": "^0.23.28",
"typedoc-plugin-markdown": "^3.14.0",
"typescript": "^4.9.5"
},
"lint-staged": {
"*.{ts,tsx}": [
"prettier --write"
]
},
"ava": {
"extensions": {
"ts": "module"
},
"nodeArguments": [
"--loader=tsx"
]
},
"keywords": [
"openai",
"chatgpt",
"chat",
"gpt",
"gpt-3",
"gpt3",
"gpt4",
"chatbot",
"chat",
"machine learning",
"conversation",
"conversational ai",

Plik diff jest za duży Load Diff

567
readme.md

@ -1,54 +1,20 @@
# Update December 18, 2022 <!-- omit in toc -->
On December 11th, OpenAI added Cloudflare protections that make it more difficult to access the unofficial API.
To circumvent these protections, we've added a **fully automated browser-based solution**, which uses Puppeteer and CAPTCHA solvers under the hood. 🔥
```ts
import { ChatGPTAPIBrowser } from 'chatgpt'
const api = new ChatGPTAPIBrowser({
email: process.env.OPENAI_EMAIL,
password: process.env.OPENAI_PASSWORD
})
await api.initSession()
const result = await api.sendMessage('Hello World!')
console.log(result.response)
```
This solution is not lightweight, but it does work a lot more consistently than the previous REST API-based approach. For example, I'm currently using this approach to automate 10 concurrent OpenAI accounts for my [Twitter bot](https://github.com/transitive-bullshit/chatgpt-twitter-bot). 😂
To use the updated version, **make sure you're using the latest version of this package and Node.js >= 18**. Then update your code following the examples below, paying special attention to the sections on [Authentication](#authentication), [Restrictions](#restrictions), and [CAPTCHAs](#captchas).
We recently added support for CAPTCHA automation using either [nopecha](https://nopecha.com/) or [2captcha](https://2captcha.com). Keep in mind that this package will be updated to use the official API as soon as it's released, so things should get much easier over time. 💪
Lastly, please consider starring this repo and <a href="https://twitter.com/transitive_bs">following me on twitter <img src="https://storage.googleapis.com/saasify-assets/twitter-logo.svg" alt="twitter" height="24px" align="center"></a> to help support the project.
Thanks && cheers,
[Travis](https://twitter.com/transitive_bs)
---
<p align="center">
<img alt="Example usage" src="/media/demo.gif">
</p>
# ChatGPT API <!-- omit in toc -->
> Node.js client for the unofficial [ChatGPT](https://openai.com/blog/chatgpt/) API.
> Node.js client for the official [ChatGPT](https://openai.com/blog/chatgpt/) API.
[![NPM](https://img.shields.io/npm/v/chatgpt.svg)](https://www.npmjs.com/package/chatgpt) [![Build Status](https://github.com/transitive-bullshit/chatgpt-api/actions/workflows/test.yml/badge.svg)](https://github.com/transitive-bullshit/chatgpt-api/actions/workflows/test.yml) [![MIT License](https://img.shields.io/badge/license-MIT-blue)](https://github.com/transitive-bullshit/chatgpt-api/blob/main/license) [![Prettier Code Formatting](https://img.shields.io/badge/code_style-prettier-brightgreen.svg)](https://prettier.io)
- [Intro](#intro)
- [Updates](#updates)
- [CLI](#cli)
- [Install](#install)
- [Usage](#usage)
- [Docs](#docs)
- [Demos](#demos)
- [Authentication](#authentication)
- [CAPTCHAs](#captchas)
- [Using Proxies](#using-proxies)
- [Restrictions](#restrictions)
- [Usage - ChatGPTAPI](#usage---chatgptapi)
- [Usage - ChatGPTUnofficialProxyAPI](#usage---chatgptunofficialproxyapi)
- [Reverse Proxy](#reverse-proxy)
- [Access Token](#access-token)
- [Docs](#docs)
- [Demos](#demos)
- [Projects](#projects)
- [Compatibility](#compatibility)
- [Credits](#credits)
@ -58,138 +24,355 @@ Thanks && cheers,
This package is a Node.js wrapper around [ChatGPT](https://openai.com/blog/chatgpt) by [OpenAI](https://openai.com). TS batteries included. ✨
You can use it to start building projects powered by ChatGPT like chatbots, websites, etc...
<p align="center">
<img alt="Example usage" src="/media/demo.gif">
</p>
## Updates
<details open>
<summary><strong>March 1, 2023</strong></summary>
<br/>
The [official OpenAI chat completions API](https://platform.openai.com/docs/guides/chat) has been released, and it is now the default for this package! 🔥
| Method | Free? | Robust? | Quality? |
| --------------------------- | ------ | -------- | ----------------------- |
| `ChatGPTAPI` | ❌ No | ✅ Yes | ✅️ Real ChatGPT models |
| `ChatGPTUnofficialProxyAPI` | ✅ Yes | ☑️ Maybe | ✅ Real ChatGPT |
**Note**: We strongly recommend using `ChatGPTAPI` since it uses the officially supported API from OpenAI. We may remove support for `ChatGPTUnofficialProxyAPI` in a future release.
1. `ChatGPTAPI` - Uses the `gpt-3.5-turbo-0301` model with the official OpenAI chat completions API (official, robust approach, but it's not free)
2. `ChatGPTUnofficialProxyAPI` - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)
</details>
<details>
<summary><strong>Previous Updates</strong></summary>
<br/>
<details>
<summary><strong>Feb 19, 2023</strong></summary>
<br/>
We now provide three ways of accessing the unofficial ChatGPT API, all of which have tradeoffs:
| Method | Free? | Robust? | Quality? |
| --------------------------- | ------ | -------- | ----------------- |
| `ChatGPTAPI` | ❌ No | ✅ Yes | ☑️ Mimics ChatGPT |
| `ChatGPTUnofficialProxyAPI` | ✅ Yes | ☑️ Maybe | ✅ Real ChatGPT |
| `ChatGPTAPIBrowser` (v3) | ✅ Yes | ❌ No | ✅ Real ChatGPT |
**Note**: I recommend that you use either `ChatGPTAPI` or `ChatGPTUnofficialProxyAPI`.
1. `ChatGPTAPI` - Previously used `text-davinci-003` to mimic ChatGPT via the official OpenAI completions API (the most robust approach, but it's not free and doesn't use a model fine-tuned for chat)
2. `ChatGPTUnofficialProxyAPI` - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)
3. `ChatGPTAPIBrowser` - (_deprecated_; v3.5.1 of this package) Uses Puppeteer to access the official ChatGPT webapp (uses the real ChatGPT, but very flaky, heavyweight, and error prone)
</details>
<details>
<summary><strong>Feb 5, 2023</strong></summary>
<br/>
OpenAI has disabled the leaked chat model we were previously using, so we're now defaulting to `text-davinci-003`, which is not free.
We've found several other hidden, fine-tuned chat models, but OpenAI keeps disabling them, so we're searching for alternative workarounds.
</details>
<details>
<summary><strong>Feb 1, 2023</strong></summary>
<br/>
This package no longer requires any browser hacks – **it is now using the official OpenAI completions API** with a leaked model that ChatGPT uses under the hood. 🔥
```ts
import { ChatGPTAPI } from 'chatgpt'
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY
})
const res = await api.sendMessage('Hello World!')
console.log(res.text)
```
Please upgrade to `chatgpt@latest` (at least [v4.0.0](https://github.com/transitive-bullshit/chatgpt-api/releases/tag/v4.0.0)). The updated version is **significantly more lightweight and robust** compared with previous versions. You also don't have to worry about IP issues or rate limiting.
Huge shoutout to [@waylaidwanderer](https://github.com/waylaidwanderer) for discovering the leaked chat model!
</details>
</details>
If you run into any issues, we do have a pretty active [Discord](https://discord.gg/v9gERj825w) with a bunch of ChatGPT hackers from the Node.js & Python communities.
Lastly, please consider starring this repo and <a href="https://twitter.com/transitive_bs">following me on twitter <img src="https://storage.googleapis.com/saasify-assets/twitter-logo.svg" alt="twitter" height="24px" align="center"></a> to help support the project.
Thanks && cheers,
[Travis](https://twitter.com/transitive_bs)
## CLI
To run the CLI, you'll need an [OpenAI API key](https://platform.openai.com/overview):
```bash
export OPENAI_API_KEY="sk-TODO"
npx chatgpt "your prompt here"
```
By default, the response is streamed to stdout, the results are stored in a local config file, and every invocation starts a new conversation. You can use `-c` to continue the previous conversation and `--no-stream` to disable streaming.
```
Usage:
$ chatgpt <prompt>
Commands:
<prompt> Ask ChatGPT a question
rm-cache Clears the local message cache
ls-cache Prints the local message cache path
For more info, run any command with the `--help` flag:
$ chatgpt --help
$ chatgpt rm-cache --help
$ chatgpt ls-cache --help
Options:
-c, --continue Continue last conversation (default: false)
-d, --debug Enables debug logging (default: false)
-s, --stream Streams the response (default: true)
-s, --store Enables the local message cache (default: true)
-t, --timeout Timeout in milliseconds
-k, --apiKey OpenAI API key
-o, --apiOrg OpenAI API organization
-n, --conversationName Unique name for the conversation
-h, --help Display this message
-v, --version Display version number
```
## Install
```bash
npm install chatgpt puppeteer
npm install chatgpt
```
`puppeteer` is an optional peer dependency used to automate bypassing the Cloudflare protections via `getOpenAIAuth`. The main API wrapper uses `fetch` directly.
Make sure you're using `node >= 18` so `fetch` is available (or `node >= 14` if you install a [fetch polyfill](https://github.com/developit/unfetch#usage-as-a-polyfill)).
## Usage
To use this module from Node.js, you need to pick between two methods:
| Method | Free? | Robust? | Quality? |
| --------------------------- | ------ | -------- | ----------------------- |
| `ChatGPTAPI` | ❌ No | ✅ Yes | ✅️ Real ChatGPT models |
| `ChatGPTUnofficialProxyAPI` | ✅ Yes | ☑️ Maybe | ✅ Real ChatGPT |
1. `ChatGPTAPI` - Uses the `gpt-3.5-turbo-0301` model with the official OpenAI chat completions API (official, robust approach, but it's not free). You can override the model, completion params, and system message to fully customize your assistant.
2. `ChatGPTUnofficialProxyAPI` - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)
Both approaches have very similar APIs, so it should be simple to swap between them.
**Note**: We strongly recommend using `ChatGPTAPI` since it uses the officially supported API from OpenAI. We may remove support for `ChatGPTUnofficialProxyAPI` in a future release.
### Usage - ChatGPTAPI
Sign up for an [OpenAI API key](https://platform.openai.com/overview) and store it in your environment.
```ts
import { ChatGPTAPIBrowser } from 'chatgpt'
import { ChatGPTAPI } from 'chatgpt'
async function example() {
// use puppeteer to bypass cloudflare (headful because of captchas)
const api = new ChatGPTAPIBrowser({
email: process.env.OPENAI_EMAIL,
password: process.env.OPENAI_PASSWORD
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY
})
await api.initSession()
const result = await api.sendMessage('Hello World!')
console.log(result.response)
const res = await api.sendMessage('Hello World!')
console.log(res.text)
}
```
<details>
<summary>Or, if you want to use the REST-based version:</summary>
You can override the default `model` (`gpt-3.5-turbo-0301`) and any [OpenAI chat completion params](https://platform.openai.com/docs/api-reference/chat/create) using `completionParams`:
```ts
import { ChatGPTAPI, getOpenAIAuth } from 'chatgpt'
async function example() {
// use puppeteer to bypass cloudflare (headful because of captchas)
const openAIAuth = await getOpenAIAuth({
email: process.env.OPENAI_EMAIL,
password: process.env.OPENAI_PASSWORD
})
const api = new ChatGPTAPI({ ...openAIAuth })
await api.initSession()
// send a message and wait for the response
const result = await api.sendMessage('Write a python version of bubble sort.')
// result.response is a markdown-formatted string
console.log(result.response)
}
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY,
completionParams: {
temperature: 0.5,
top_p: 0.8
}
})
```
</details>
ChatGPT responses are formatted as markdown by default. If you want to work with plaintext instead, you can use:
If you want to track the conversation, you'll need to pass the `parentMessageId` like this:
```ts
const api = new ChatGPTAPIBrowser({ email, password, markdown: false })
```
If you want to track the conversation, use the `conversationId` and `messageId` in the result object, and pass them to `sendMessage` as `conversationId` and `parentMessageId` respectively.
```ts
const api = new ChatGPTAPIBrowser({ email, password })
await api.initSession()
const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })
// send a message and wait for the response
let res = await api.sendMessage('What is OpenAI?')
console.log(res.response)
console.log(res.text)
// send a follow-up
res = await api.sendMessage('Can you expand on that?', {
conversationId: res.conversationId,
parentMessageId: res.messageId
parentMessageId: res.id
})
console.log(res.response)
console.log(res.text)
// send another follow-up
// send a follow-up
res = await api.sendMessage('What were we talking about?', {
conversationId: res.conversationId,
parentMessageId: res.messageId
parentMessageId: res.id
})
console.log(res.response)
console.log(res.text)
```
Sometimes, ChatGPT will hang for an extended period of time before beginning to respond. This may be due to rate limiting or it may be due to OpenAI's servers being overloaded.
You can add streaming via the `onProgress` handler:
To mitigate these issues, you can add a timeout like this:
```ts
const res = await api.sendMessage('Write a 500 word essay on frogs.', {
// print the partial response as the AI is "typing"
onProgress: (partialResponse) => console.log(partialResponse.text)
})
// print the full text at the end
console.log(res.text)
```
You can add a timeout using the `timeoutMs` option:
```ts
// timeout after 2 minutes (which will also abort the underlying HTTP request)
const response = await api.sendMessage('this is a timeout test', {
timeoutMs: 2 * 60 * 1000
const response = await api.sendMessage(
'write me a really really long essay on frogs',
{
timeoutMs: 2 * 60 * 1000
}
)
```
If you want to see more info about what's actually being sent to [OpenAI's chat completions API](https://platform.openai.com/docs/api-reference/chat/create), set the `debug: true` option in the `ChatGPTAPI` constructor:
```ts
const api = new ChatGPTAPI({
apiKey: process.env.OPENAI_API_KEY,
debug: true
})
```
We default to a basic `systemMessage`. You can override this in either the `ChatGPTAPI` constructor or `sendMessage`:
```ts
const res = await api.sendMessage('what is the answer to the universe?', {
systemMessage: `You are ChatGPT, a large language model trained by OpenAI. You answer as concisely as possible for each response. If you are generating a list, do not have too many items.
Current date: ${new Date().toISOString()}\n\n`
})
```
Note that we automatically handle appending the previous messages to the prompt and attempt to optimize for the available tokens (which defaults to `4096`).
<details>
<summary>Usage in CommonJS (Dynamic import)</summary>
```js
async function example() {
// To use ESM in CommonJS, you can use a dynamic import
const { ChatGPTAPI, getOpenAIAuth } = await import('chatgpt')
// To use ESM in CommonJS, you can use a dynamic import like this:
const { ChatGPTAPI } = await import('chatgpt')
// You can also try dynamic importing like this:
// const importDynamic = new Function('modulePath', 'return import(modulePath)')
// const { ChatGPTAPI } = await importDynamic('chatgpt')
const openAIAuth = await getOpenAIAuth({
email: process.env.OPENAI_EMAIL,
password: process.env.OPENAI_PASSWORD
})
const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })
const api = new ChatGPTAPI({ ...openAIAuth })
await api.initSession()
const result = await api.sendMessage('Hello World!')
console.log(result)
const res = await api.sendMessage('Hello World!')
console.log(res.text)
}
```
</details>
### Docs
### Usage - ChatGPTUnofficialProxyAPI
See the [auto-generated docs](./docs/classes/ChatGPTAPI.md) for more info on methods and parameters. Here are the [docs](./docs/classes/ChatGPTAPIBrowser.md) for the browser-based version.
The API for `ChatGPTUnofficialProxyAPI` is almost exactly the same. You just need to provide a ChatGPT `accessToken` instead of an OpenAI API key.
### Demos
```ts
import { ChatGPTUnofficialProxyAPI } from 'chatgpt'
async function example() {
const api = new ChatGPTUnofficialProxyAPI({
accessToken: process.env.OPENAI_ACCESS_TOKEN
})
const res = await api.sendMessage('Hello World!')
console.log(res.text)
}
```
See [demos/demo-reverse-proxy](./demos/demo-reverse-proxy.ts) for a full example:
```bash
npx tsx demos/demo-reverse-proxy.ts
```
`ChatGPTUnofficialProxyAPI` messages also contain a `conversationId` in addition to `parentMessageId`, since the ChatGPT webapp can't reference messages across different accounts & conversations.
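A sketch of threading follow-ups with the proxy API (assuming `api` is the `ChatGPTUnofficialProxyAPI` instance from the example above and that its results expose `conversationId` and `id` like the `ChatGPTAPI` results do):

```ts
let res = await api.sendMessage('What is OpenAI?')
console.log(res.text)

// pass BOTH ids back on each follow-up so the proxy can find the thread
res = await api.sendMessage('Can you expand on that?', {
  conversationId: res.conversationId,
  parentMessageId: res.id
})
console.log(res.text)
```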
#### Reverse Proxy
You can override the reverse proxy by passing `apiReverseProxyUrl`:
```ts
const api = new ChatGPTUnofficialProxyAPI({
accessToken: process.env.OPENAI_ACCESS_TOKEN,
apiReverseProxyUrl: 'https://your-example-server.com/api/conversation'
})
```
Known reverse proxies run by community members include:
| Reverse Proxy URL | Author | Rate Limits | Last Checked |
| ------------------------------------------------ | -------------------------------------------- | ----------------- | ------------ |
| `https://bypass.churchless.tech/api/conversation` | [@acheong08](https://github.com/acheong08) | 5 req / 10 seconds by IP | 3/24/2023 |
| `https://api.pawan.krd/backend-api/conversation` | [@PawanOsman](https://github.com/PawanOsman) | 50 req / 15 seconds (~3 r/s) | 3/23/2023 |
Note: info on how the reverse proxies work is not being published at this time in order to prevent OpenAI from disabling access.
#### Access Token
To use `ChatGPTUnofficialProxyAPI`, you'll need an OpenAI access token from the ChatGPT webapp. To do this, you can use any of the following methods which take an `email` and `password` and return an access token:
- Node.js libs
- [ericlewis/openai-authenticator](https://github.com/ericlewis/openai-authenticator)
- [michael-dm/openai-token](https://github.com/michael-dm/openai-token)
- [allanoricil/chat-gpt-authenticator](https://github.com/AllanOricil/chat-gpt-authenticator)
- Python libs
- [acheong08/OpenAIAuth](https://github.com/acheong08/OpenAIAuth)
These libraries work with email + password accounts (i.e., they do not support accounts that authenticate via Microsoft / Google).
Alternatively, you can manually get an `accessToken` by logging in to the ChatGPT webapp and then opening `https://chat.openai.com/api/auth/session`, which will return a JSON object containing your `accessToken` string.
Access tokens last for days.
**Note**: using a reverse proxy will expose your access token to a third party. There shouldn't be any adverse effects from this, but please consider the risks before using this method.
## Docs
See the [auto-generated docs](./docs/classes/ChatGPTAPI.md) for more info on methods and parameters.
## Demos
Most of the demos use `ChatGPTAPI`. It should be pretty easy to convert them to use `ChatGPTUnofficialProxyAPI` if you'd rather use that approach. The only thing that needs to change is how you initialize the api with an `accessToken` instead of an `apiKey`.
To run the included demos:
1. clone repo
2. install node deps
3. set `OPENAI_EMAIL` and `OPENAI_PASSWORD` in .env
3. set `OPENAI_API_KEY` in .env
A [basic demo](./demos/demo.ts) is included for testing purposes:
@ -197,105 +380,31 @@ A [basic demo](./demos/demo.ts) is included for testing purposes:
npx tsx demos/demo.ts
```
A [conversation demo](./demos/demo-conversation.ts) is also included:
A [demo showing on progress handler](./demos/demo-on-progress.ts):
```bash
npx tsx demos/demo-on-progress.ts
```
The on progress demo uses the optional `onProgress` parameter to `sendMessage` to receive intermediary results as ChatGPT is "typing".
A [conversation demo](./demos/demo-conversation.ts):
```bash
npx tsx demos/demo-conversation.ts
```
A [browser-based conversation demo](./demos/demo-conversation-browser.ts) is also included:
A [persistence demo](./demos/demo-persistence.ts) shows how to store messages in Redis for persistence:
```bash
npx tsx demos/demo-conversation-browser.ts
npx tsx demos/demo-persistence.ts
```
### Authentication
Any [keyv adaptor](https://github.com/jaredwray/keyv) is supported for persistence, and there are overrides if you'd like to use a different way of storing / retrieving messages.
The authentication section relates to the REST-based version (using `getOpenAIAuth` + `ChatGPTAPI`). The browser-based solution, `ChatGPTAPIBrowser`, takes care of all the authentication for you.
Note that persisting messages is required for remembering the context of previous conversations beyond the scope of the current Node.js process, since by default we only store messages in memory. Here's an [external demo](https://github.com/transitive-bullshit/chatgpt-twitter-bot/blob/main/src/index.ts#L86-L95) of using a completely custom database solution to persist messages.
On December 11, 2022, OpenAI added some additional Cloudflare protections which make it more difficult to access the unofficial API.
You'll need a valid OpenAI "session token" and Cloudflare "clearance token" in order to use the API.
We've provided an automated, Puppeteer-based solution `getOpenAIAuth` to fetch these for you, but you may still run into cases where you have to manually pass the CAPTCHA. We're working on a solution to automate this further.
You can also get these tokens manually, but keep in mind that the `clearanceToken` only lasts for max 2 hours.
<details>
<summary>Getting tokens manually</summary>
To get session token manually:
1. Go to https://chat.openai.com/chat and log in or sign up.
2. Open dev tools.
3. Open `Application` > `Cookies`.
![ChatGPT cookies](./media/session-token.png)
4. Copy the value for `__Secure-next-auth.session-token` and save it to your environment. This will be your `sessionToken`.
5. Copy the value for `cf_clearance` and save it to your environment. This will be your `clearanceToken`.
6. Copy the value of the `user-agent` header from any request in your `Network` tab. This will be your `userAgent`.
Pass `sessionToken`, `clearanceToken`, and `userAgent` to the `ChatGPTAPI` constructor.
</details>
> **Note**
> This package will switch to using the official API once it's released, which will make this process much simpler.
### CAPTCHAs
The browser portions of this package use Puppeteer to automate as much as possible, including solving all CAPTCHAs. 🔥
Basic Cloudflare CAPTCHAs are handled by default, but if you want to automate the email + password Recaptchas, you'll need to sign up for one of these paid providers:
- [nopecha](https://nopecha.com/) - Uses AI to solve CAPTCHAS
- Faster and cheaper
- Set the `NOPECHA_KEY` env var to your nopecha API key
- [Demo video](https://user-images.githubusercontent.com/552829/208235991-de4890f2-e7ba-4b42-bf55-4fcd792d4b19.mp4) of nopecha solving the login Recaptcha (41 seconds)
- [2captcha](https://2captcha.com) - Uses real people to solve CAPTCHAS
- More well-known solution that's been around longer
- Set the `CAPTCHA_TOKEN` env var to your 2captcha API token
Alternatively, if your OpenAI account uses Google Auth, you shouldn't encounter any of the more complicated Recaptchas — and can avoid using these third-party providers. To use Google auth, make sure your OpenAI account is using Google and then set `isGoogleLogin` to `true` whenever you're passing your `email` and `password`. For example:
```ts
const api = new ChatGPTAPIBrowser({
email: process.env.OPENAI_EMAIL,
password: process.env.OPENAI_PASSWORD,
isGoogleLogin: true
})
```
### Using Proxies
The browser implementation supports setting a proxy server. This is useful if you're running into rate limiting issues or if you want to use a proxy to hide your IP address.
To use a proxy, pass the `proxyServer` option to the `ChatGPTAPIBrowser` constructor, or simply set the `PROXY_SERVER` env var. For more information on the format, see [here](https://www.chromium.org/developers/design-documents/network-settings).
```ts
const api = new ChatGPTAPIBrowser({
email: process.env.OPENAI_EMAIL,
password: process.env.OPENAI_PASSWORD,
proxyServer: '<ip>:<port>'
})
```
You can also set the `PROXY_VALIDATE_IP` env var to your proxy's IP address. This will be used to validate that the proxy is working correctly, and will throw an error if it's not.
### Restrictions
These restrictions are for the `getOpenAIAuth` + `ChatGPTAPI` solution, which uses the unofficial API. The browser-based solution, `ChatGPTAPIBrowser`, generally doesn't have any of these restrictions.
**Please read carefully**
- You must use `node >= 18` at the moment. I'm using `v19.2.0` in my testing.
- Cloudflare `cf_clearance` **tokens expire after 2 hours**, so right now we recommend that you refresh your `cf_clearance` token every hour or so.
- Your `user-agent` and `IP address` **must match** from the real browser window you're logged in with to the one you're using for `ChatGPTAPI`.
- This means that you currently can't log in with your laptop and then run the bot on a server or proxy somewhere.
- Cloudflare will still sometimes ask you to complete a CAPTCHA, so you may need to keep an eye on it and manually resolve the CAPTCHA.
- You should not be using this account while the bot is using it, because that browser window may refresh one of your tokens and invalidate the bot's session.
> **Note**
> Prior to v1.0.0, this package used a headless browser via [Playwright](https://playwright.dev/) to automate the web UI. Here are the [docs for the initial browser version](https://github.com/transitive-bullshit/chatgpt-api/tree/v0.4.2).
**Note**: Persistence is handled automatically when using `ChatGPTUnofficialProxyAPI` because it is connecting indirectly to ChatGPT.
## Projects
@ -303,6 +412,8 @@ All of these awesome projects are built using the `chatgpt` package. 🤯
- [Twitter Bot](https://github.com/transitive-bullshit/chatgpt-twitter-bot) powered by ChatGPT ✨
- Mention [@ChatGPTBot](https://twitter.com/ChatGPTBot) on Twitter with your prompt to try it out
- [ChatGPT API Server](https://github.com/waylaidwanderer/node-chatgpt-api) - API server for this package with support for multiple OpenAI accounts, proxies, and load-balancing requests between accounts.
- [ChatGPT Prompts](https://github.com/pacholoamit/chatgpt-prompts) - A collection of 140+ of the best ChatGPT prompts from the community.
- [Lovelines.xyz](https://lovelines.xyz?ref=chatgpt-api)
- [Chrome Extension](https://github.com/gragland/chatgpt-everywhere) ([demo](https://twitter.com/gabe_ragland/status/1599466486422470656))
- [VSCode Extension #1](https://github.com/mpociot/chatgpt-vscode) ([demo](https://twitter.com/marcelpociot/status/1599180144551526400), [updated version](https://github.com/timkmecl/chatgpt-vscode), [marketplace](https://marketplace.visualstudio.com/items?itemName=timkmecl.chatgpt))
@ -314,18 +425,27 @@ All of these awesome projects are built using the `chatgpt` package. 🤯
- [Telegram Bot #1](https://github.com/realies/chatgpt-telegram-bot)
- [Telegram Bot #2](https://github.com/dawangraoming/chatgpt-telegram-bot)
- [Telegram Bot #3](https://github.com/RainEggplant/chatgpt-telegram-bot) (group privacy mode, ID-based auth)
- [Telegram Bot #4](https://github.com/ArdaGnsrn/chatgpt-telegram) (queue system, ID-based chat thread)
- [Telegram Bot #5](https://github.com/azoway/chatgpt-telegram-bot) (group privacy mode, ID-based chat thread)
- [Deno Telegram Bot](https://github.com/Ciyou/chatbot-telegram)
- [Go Telegram Bot](https://github.com/m1guelpf/chatgpt-telegram)
- [Telegram Bot for YouTube Summaries](https://github.com/codextde/youtube-summary)
- [GitHub ProBot](https://github.com/oceanlvr/ChatGPTBot)
- [Discord Bot #1](https://github.com/onury5506/Discord-ChatGPT-Bot)
- [Discord Bot #2](https://github.com/Nageld/ChatGPT-Bot)
- [Discord Bot #3](https://github.com/leinstay/gptbot)
- [Discord Bot #4 (selfbot)](https://github.com/0x7030676e31/cumsocket)
- [Discord Bot #5](https://github.com/itskdhere/ChatGPT-Discord-BOT)
- [Discord Bot #6 (Shakespeare bot)](https://gist.github.com/TheBrokenRail/4b37e7c44e8f721d8bd845050d034c16)
- [Discord Bot #7](https://github.com/Elitezen/discordjs-chatgpt)
- [Zoom Chat](https://github.com/shixin-guo/my-bot)
- [WeChat Bot #1](https://github.com/AutumnWhj/ChatGPT-wechat-bot)
- [WeChat Bot #2](https://github.com/fuergaosi233/wechat-chatgpt)
- [WeChat Bot #3](https://github.com/wangrongding/wechat-bot)
- [WeChat Bot #3](https://github.com/wangrongding/wechat-bot)
- [WeChat Bot #4](https://github.com/darknightlab/wechat-bot)
- [WeChat Bot #5](https://github.com/sunshanpeng/wechaty-chatgpt)
- [WeChat Bot #6](https://github.com/formulahendry/chatgpt-wechat-bot)
- [WeChat Bot #7](https://github.com/gfl94/Chatbot004)
- [QQ Bot (plugin for Yunzai-bot)](https://github.com/ikechan8370/chatgpt-plugin)
- [QQ Bot (plugin for KiviBot)](https://github.com/KiviBotLab/kivibot-plugin-chatgpt)
- [QQ Bot (oicq)](https://github.com/easydu2002/chat_gpt_oicq)
@ -335,32 +455,57 @@ All of these awesome projects are built using the `chatgpt` package. 🤯
- [Flutter ChatGPT API](https://github.com/coskuncay/flutter_chatgpt_api)
- [Carik Bot](https://github.com/luridarmawan/Carik)
- [Github Action for reviewing PRs](https://github.com/kxxt/chatgpt-action/)
- [WhatsApp Bot #1](https://github.com/askrella/whatsapp-chatgpt) (DALL-E + Whisper support 💪)
- [WhatsApp Bot #2](https://github.com/amosayomide05/chatgpt-whatsapp-bot)
- [WhatsApp Bot #3](https://github.com/navopw/whatsapp-chatgpt)
- [WhatsApp Bot #4](https://github.com/pascalroget/whatsgpt) (multi-user support)
- [WhatsApp Bot #5](https://github.com/noelzappy/chatgpt-whatsapp) (schedule periodic messages)
- [WhatsApp Bot #6](https://github.com/hujanais/bs-chat-gpt3-api) (RaspberryPi + ngrok + Twilio)
- [WhatsApp Bot #7](https://github.com/dannysantino/whatsgpt) (Session and chat history storage with MongoStore)
- [Matrix Bot](https://github.com/matrixgpt/matrix-chatgpt-bot)
- [Rental Cover Letter Generator](https://sharehouse.app/ai)
- [Assistant CLI](https://github.com/diciaup/assistant-cli)
- [Teams Bot](https://github.com/formulahendry/chatgpt-teams-bot)
- [Askai](https://github.com/yudax42/askai)
- [TalkGPT](https://github.com/ShadovvBeast/TalkGPT)
- [ChatGPT With Voice](https://github.com/thanhsonng/chatgpt-voice)
- [iOS Shortcut](https://github.com/leecobaby/shortcuts/blob/master/other/ChatGPT_EN.md)
- [Slack Bot #1](https://github.com/trietphm/chatgpt-slackbot/)
- [Slack Bot #2](https://github.com/lokwkin/chatgpt-slackbot-node/) (with queueing mechanism)
- [Slack Bot #3](https://github.com/NessunKim/slack-chatgpt/)
- [Slack Bot #4](https://github.com/MarkusGalant/chatgpt-slackbot-serverless/) (Serverless AWS Lambda)
- [Slack Bot #5](https://github.com/benjiJanssens/SlackGPT) (Hosted)
  - [Add to Slack](https://slackgpt.benji.sh/slack/install)
- [Electron Bot](https://github.com/ShiranAbir/chaty)
- [Kodyfire CLI](https://github.com/nooqta/chatgpt-kodyfire)
- [Twitch Bot](https://github.com/BennyDeeDev/chatgpt-twitch-bot)
- [Continuous Conversation](https://github.com/DanielTerletzkiy/chat-gtp-assistant)
- [Figma plugin](https://github.com/frederickk/chatgpt-figma-plugin)
- [NestJS server](https://github.com/RusDyn/chatgpt_nestjs_server)
- [NestJS ChatGPT Starter Boilerplate](https://github.com/mitkodkn/nestjs-chatgpt-starter)
- [Wordsmith: Add-in for Microsoft Word](https://github.com/xtremehpx/Wordsmith)
- [QuizGPT: Create Kahoot quizzes with ChatGPT](https://github.com/Kladdy/quizgpt)
- [openai-chatgpt: Talk to ChatGPT from the terminal](https://github.com/gmpetrov/openai-chatgpt)
- [Clippy the Salesforce chatbot](https://github.com/sebas00/chatgptclippy) (ClippyJS joke bot)
- [ai-assistant](https://github.com/youking-lib/ai-assistant) Chat assistant
- [Feishu Bot](https://github.com/linjungz/feishu-chatgpt-bot)
- [DomainGPT: Discover available domain names](https://github.com/billylo1/DomainGPT)
- [AI Poem Generator](https://aipoemgenerator.com/)
- [Next.js ChatGPT With Firebase](https://github.com/youngle316/chatgpt)
- [ai-commit – GPT-3 Commit Message Generator](https://github.com/insulineru/ai-commit)
- [AItinerary – ChatGPT itinerary Generator](https://aitinerary.ai)
- [wechaty-chatgpt - A chatbot based on Wechaty & ChatGPT](https://github.com/zhengxs2018/wechaty-chatgpt)
- [Julius GPT](https://github.com/christophebe/julius-gpt) - Generate and publish your content from the CLI
- [OpenAI-API-Service](https://github.com/Jarvan-via/api-service) - Provides OpenAI related APIs for businesses
If you create a cool integration, feel free to open a PR and add it to the list.
## Compatibility
- This package is ESM-only.
- This package supports `node >= 14`.
- This module assumes that `fetch` is installed.
  - In `node >= 18`, it's installed by default.
  - In `node < 18`, you need to install a polyfill like `unfetch/polyfill` ([guide](https://github.com/developit/unfetch#usage-as-a-polyfill)) or `isomorphic-fetch` ([guide](https://github.com/matthew-andrews/isomorphic-fetch#readme)).
- If you want to build a website using `chatgpt`, we recommend using it only from your backend API.
## Credits


@@ -1,70 +0,0 @@
import * as types from './types'
export abstract class AChatGPTAPI {
/**
* Performs any async initialization work required to ensure that this API is
* properly authenticated.
*
* @throws An error if the session failed to initialize properly.
*/
abstract initSession(): Promise<void>
/**
* Sends a message to ChatGPT, waits for the response to resolve, and returns
* the response.
*
* If you want to receive a stream of partial responses, use `opts.onProgress`.
*
* @param message - The prompt message to send
* @param opts.conversationId - Optional ID of a conversation to continue
* @param opts.parentMessageId - Optional ID of the previous message in the conversation
* @param opts.messageId - Optional ID of the message to send (defaults to a random UUID)
* @param opts.action - Optional ChatGPT `action` (either `next` or `variant`)
* @param opts.timeoutMs - Optional timeout in milliseconds (defaults to no timeout)
* @param opts.onProgress - Optional callback which will be invoked every time the partial response is updated
* @param opts.abortSignal - Optional callback used to abort the underlying `fetch` call using an [AbortController](https://developer.mozilla.org/en-US/docs/Web/API/AbortController)
*
* @returns The response from ChatGPT, including `conversationId`, `messageId`, and
* the `response` text.
*/
abstract sendMessage(
message: string,
opts?: types.SendMessageOptions
): Promise<types.ChatResponse>
/**
* @returns `true` if the client is authenticated with a valid session or `false`
* otherwise.
*/
abstract getIsAuthenticated(): Promise<boolean>
/**
* Refreshes the current ChatGPT session.
*
* Useful for bypassing 403 errors when Cloudflare clearance tokens expire.
*
* @returns Access credentials for the new session.
* @throws An error if it fails.
*/
abstract refreshSession(): Promise<any>
/**
* Closes the current ChatGPT session and starts a new one.
*
* Useful for bypassing 401 errors when sessions expire.
*
* @returns Access credentials for the new session.
* @throws An error if it fails.
*/
async resetSession(): Promise<any> {
await this.closeSession()
return this.initSession()
}
/**
* Closes the active session.
*
* @throws An error if it fails.
*/
abstract closeSession(): Promise<void>
}
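// Illustrative sketch (not part of this file): driving any concrete
// implementation of the AChatGPTAPI contract above. `createApi()` is a
// hypothetical factory returning such an implementation.
//
//   const api: AChatGPTAPI = createApi()
//   await api.initSession()
//   if (await api.getIsAuthenticated()) {
//     const res = await api.sendMessage('Hello')
//     console.log(res.response) // ChatResponse text
//   }
//   await api.closeSession()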


@@ -1,699 +0,0 @@
import delay from 'delay'
import type { Browser, HTTPRequest, HTTPResponse, Page } from 'puppeteer'
import { v4 as uuidv4 } from 'uuid'
import * as types from './types'
import { AChatGPTAPI } from './abstract-chatgpt-api'
import { getBrowser, getOpenAIAuth } from './openai-auth'
import {
browserPostEventStream,
isRelevantRequest,
markdownToText,
maximizePage,
minimizePage
} from './utils'
const CHAT_PAGE_URL = 'https://chat.openai.com/chat'
export class ChatGPTAPIBrowser extends AChatGPTAPI {
protected _markdown: boolean
protected _debug: boolean
protected _minimize: boolean
protected _isGoogleLogin: boolean
protected _isMicrosoftLogin: boolean
protected _captchaToken: string
protected _nopechaKey: string
protected _accessToken: string
protected _email: string
protected _password: string
protected _executablePath: string
protected _browser: Browser
protected _page: Page
protected _proxyServer: string
protected _isRefreshing: boolean
/**
* Creates a new client for automating the ChatGPT webapp.
*/
constructor(opts: {
email: string
password: string
/** @defaultValue `true` **/
markdown?: boolean
/** @defaultValue `false` **/
debug?: boolean
/** @defaultValue `false` **/
isGoogleLogin?: boolean
/** @defaultValue `false` **/
isMicrosoftLogin?: boolean
/** @defaultValue `true` **/
minimize?: boolean
/** @defaultValue `undefined` **/
captchaToken?: string
/** @defaultValue `undefined` **/
nopechaKey?: string
/** @defaultValue `undefined` **/
executablePath?: string
/** @defaultValue `undefined` **/
proxyServer?: string
}) {
super()
const {
email,
password,
markdown = true,
debug = false,
isGoogleLogin = false,
isMicrosoftLogin = false,
minimize = true,
captchaToken,
nopechaKey,
executablePath,
proxyServer
} = opts
this._email = email
this._password = password
this._markdown = !!markdown
this._debug = !!debug
this._isGoogleLogin = !!isGoogleLogin
this._isMicrosoftLogin = !!isMicrosoftLogin
this._minimize = !!minimize
this._captchaToken = captchaToken
this._nopechaKey = nopechaKey
this._executablePath = executablePath
this._proxyServer = proxyServer
this._isRefreshing = false
if (!this._email) {
const error = new types.ChatGPTError('ChatGPT invalid email')
error.statusCode = 401
throw error
}
if (!this._password) {
const error = new types.ChatGPTError('ChatGPT invalid password')
error.statusCode = 401
throw error
}
}
override async initSession() {
if (this._browser) {
await this.closeSession()
}
try {
this._browser = await getBrowser({
captchaToken: this._captchaToken,
nopechaKey: this._nopechaKey,
executablePath: this._executablePath,
proxyServer: this._proxyServer,
minimize: this._minimize
})
this._page =
(await this._browser.pages())[0] || (await this._browser.newPage())
if (this._proxyServer && this._proxyServer.includes('@')) {
try {
const proxyUsername = this._proxyServer.split('@')[0].split(':')[0]
const proxyPassword = this._proxyServer.split('@')[0].split(':')[1]
await this._page.authenticate({
username: proxyUsername,
password: proxyPassword
})
} catch (err) {
console.error(
`Proxy "${this._proxyServer}" throws an error at authenticating`,
err.toString()
)
}
}
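// Note: the credential parsing above assumes `proxyServer` is of the form
// `username:password@host:port`; without an '@' no proxy authentication is attempted.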
// bypass annoying popup modals
this._page.evaluateOnNewDocument(() => {
window.localStorage.setItem('oai/apps/hasSeenOnboarding/chat', 'true')
window.localStorage.setItem(
'oai/apps/hasSeenReleaseAnnouncement/2022-12-15',
'true'
)
})
// await maximizePage(this._page)
this._page.on('request', this._onRequest.bind(this))
this._page.on('response', this._onResponse.bind(this))
// bypass cloudflare and login
const authInfo = await getOpenAIAuth({
email: this._email,
password: this._password,
browser: this._browser,
page: this._page,
isGoogleLogin: this._isGoogleLogin,
isMicrosoftLogin: this._isMicrosoftLogin
})
if (this._debug) {
console.log('chatgpt', this._email, 'auth', authInfo)
}
} catch (err) {
if (this._browser) {
await this._browser.close()
}
this._browser = null
this._page = null
throw err
}
if (!this.isChatPage || this._isGoogleLogin || this._isMicrosoftLogin) {
await this._page.goto(CHAT_PAGE_URL, {
waitUntil: 'networkidle2'
})
}
// dismiss welcome modal (and other modals)
do {
const modalSelector = '[data-headlessui-state="open"]'
try {
if (!(await this._page.$(modalSelector))) {
break
}
await this._page.click(`${modalSelector} button:last-child`)
} catch (err) {
// "next" button not found in welcome modal
break
}
await delay(300)
} while (true)
if (!(await this.getIsAuthenticated())) {
throw new types.ChatGPTError('Failed to authenticate session')
}
if (this._minimize) {
return minimizePage(this._page)
}
}
_onRequest = (request: HTTPRequest) => {
const url = request.url()
if (!isRelevantRequest(url)) {
return
}
const method = request.method()
let body: any
if (method === 'POST') {
body = request.postData()
try {
body = JSON.parse(body)
} catch (_) {}
// if (url.endsWith('/conversation') && typeof body === 'object') {
// const conversationBody: types.ConversationJSONBody = body
// const conversationId = conversationBody.conversation_id
// const parentMessageId = conversationBody.parent_message_id
// const messageId = conversationBody.messages?.[0]?.id
// const prompt = conversationBody.messages?.[0]?.content?.parts?.[0]
// // TODO: store this info for the current sendMessage request
// }
}
if (this._debug) {
console.log('\nrequest', {
url,
method,
headers: request.headers(),
body
})
}
}
_onResponse = async (response: HTTPResponse) => {
const request = response.request()
const url = response.url()
if (!isRelevantRequest(url)) {
return
}
const status = response.status()
let body: any
try {
body = await response.json()
} catch (_) {}
if (this._debug) {
console.log('\nresponse', {
url,
ok: response.ok(),
status,
statusText: response.statusText(),
headers: response.headers(),
body,
request: {
method: request.method(),
headers: request.headers(),
body: request.postData()
}
})
}
if (url.endsWith('/conversation')) {
if (status === 403) {
console.log(`ChatGPT "${this._email}" error 403...`)
// this will be handled in the sendMessage error handler
// await this.refreshSession()
}
} else if (url.endsWith('api/auth/session')) {
if (status === 401) {
console.log(`ChatGPT "${this._email}" error 401...`)
// this will be handled in the sendMessage error handler
// await this.resetSession()
} else if (status === 403) {
console.log(`ChatGPT "${this._email}" error 403...`)
// this will be handled in the sendMessage error handler
// await this.refreshSession()
} else {
const session: types.SessionResult = body
if (session?.accessToken) {
this._accessToken = session.accessToken
}
}
}
}
/**
* Attempts to handle 401 errors by re-authenticating.
*/
async resetSession() {
console.log(`ChatGPT "${this._email}" resetSession...`)
try {
console.log('>>> closing session', this._email)
await this.closeSession()
console.log('<<< closing session', this._email)
await this.initSession()
console.log(`ChatGPT "${this._email}" refreshSession success`)
} catch (err) {
console.error(
`ChatGPT "${this._email}" resetSession error`,
err.toString()
)
}
}
/**
* Attempts to handle 403 errors by refreshing the page.
*/
async refreshSession() {
if (this._isRefreshing) {
return
}
this._isRefreshing = true
console.log(`ChatGPT "${this._email}" refreshSession...`)
try {
if (!this._minimize) {
await maximizePage(this._page)
}
await this._page.reload()
let response
const timeout = 120000 // 2 minutes in milliseconds
try {
// Wait for a response that includes the 'cf_clearance' cookie
response = await this._page.waitForResponse(
(response) => {
const cookie = response.headers()['set-cookie']
if (cookie?.includes('cf_clearance=')) {
const cfClearance = cookie
.split('cf_clearance=')?.[1]
?.split(';')?.[0]
// console.log('Cloudflare Cookie:', cfClearance)
return true
}
return false
},
{ timeout }
)
} catch (err) {
// Useful for when cloudflare cookie is still valid, to catch TimeoutError
response = !!(await this._getInputBox())
}
if (!response) {
throw new types.ChatGPTError('Could not fetch cf_clearance cookie')
}
if (this._minimize && this.isChatPage) {
await minimizePage(this._page)
}
console.log(`ChatGPT "${this._email}" refreshSession success`)
} catch (err) {
console.error(
`ChatGPT "${this._email}" error refreshing session`,
err.toString()
)
} finally {
this._isRefreshing = false
}
}
async getIsAuthenticated() {
try {
if (!this._accessToken) {
return false
}
const inputBox = await this._getInputBox()
return !!inputBox
} catch (err) {
// can happen when navigating during login
return false
}
}
// async getLastMessage(): Promise<string | null> {
// const messages = await this.getMessages()
// if (messages) {
// return messages[messages.length - 1]
// } else {
// return null
// }
// }
// async getPrompts(): Promise<string[]> {
// // Get all prompts
// const messages = await this._page.$$(
// '.text-base:has(.whitespace-pre-wrap):not(:has(button:nth-child(2))) .whitespace-pre-wrap'
// )
// // Prompts are always plaintext
// return Promise.all(messages.map((a) => a.evaluate((el) => el.textContent)))
// }
// async getMessages(): Promise<string[]> {
// // Get all complete messages
// // (in-progress messages that are being streamed back don't contain action buttons)
// const messages = await this._page.$$(
// '.text-base:has(.whitespace-pre-wrap):has(button:nth-child(2)) .whitespace-pre-wrap'
// )
// if (this._markdown) {
// const htmlMessages = await Promise.all(
// messages.map((a) => a.evaluate((el) => el.innerHTML))
// )
// const markdownMessages = htmlMessages.map((messageHtml) => {
// // parse markdown from message HTML
// messageHtml = messageHtml
// .replaceAll('Copy code</button>', '</button>')
// .replace(/Copy code\s*<\/button>/gim, '</button>')
// return html2md(messageHtml, {
// ignoreTags: [
// 'button',
// 'svg',
// 'style',
// 'form',
// 'noscript',
// 'script',
// 'meta',
// 'head'
// ],
// skipTags: ['button', 'svg']
// })
// })
// return markdownMessages
// } else {
// // plaintext
// const plaintextMessages = await Promise.all(
// messages.map((a) => a.evaluate((el) => el.textContent))
// )
// return plaintextMessages
// }
// }
override async sendMessage(
message: string,
opts: types.SendMessageOptions = {}
): Promise<types.ChatResponse> {
const {
conversationId,
parentMessageId = uuidv4(),
messageId = uuidv4(),
action = 'next',
timeoutMs
// TODO
// onProgress
} = opts
const url = `https://chat.openai.com/backend-api/conversation`
const body: types.ConversationJSONBody = {
action,
messages: [
{
id: messageId,
role: 'user',
content: {
content_type: 'text',
parts: [message]
}
}
],
model: 'text-davinci-002-render',
parent_message_id: parentMessageId
}
if (conversationId) {
body.conversation_id = conversationId
}
let result: types.ChatResponse | types.ChatError
let numTries = 0
let is401 = false
do {
if (is401 || !(await this.getIsAuthenticated())) {
console.log(`chatgpt re-authenticating ${this._email}`)
try {
await this.resetSession()
} catch (err) {
console.warn(
`chatgpt error re-authenticating ${this._email}`,
err.toString()
)
}
if (!(await this.getIsAuthenticated())) {
const error = new types.ChatGPTError('Not signed in')
error.statusCode = 401
throw error
}
}
try {
// console.log('>>> EVALUATE', url, this._accessToken, body)
result = await this._page.evaluate(
browserPostEventStream,
url,
this._accessToken,
body,
timeoutMs
)
} catch (err) {
// We catch all errors in `browserPostEventStream`, so this should really
// only happen if the page is refreshed or closed during its invocation.
// This may happen if we encounter a 401/403 and refresh the page in its
// response handler or if the user has closed the page manually.
if (++numTries >= 2) {
const error = new types.ChatGPTError(err.toString())
error.statusCode = err.response?.statusCode
error.statusText = err.response?.statusText
throw error
}
console.warn('chatgpt sendMessage error; retrying...', err.toString())
await delay(5000)
continue
}
if ('error' in result) {
const error = new types.ChatGPTError(result.error.message)
error.statusCode = result.error.statusCode
error.statusText = result.error.statusText
++numTries
if (error.statusCode === 401) {
is401 = true
if (numTries >= 2) {
throw error
} else {
continue
}
} else if (error.statusCode !== 403) {
throw error
} else if (numTries >= 2) {
await this.refreshSession()
throw error
} else {
await this.refreshSession()
await delay(1000)
result = null
continue
}
} else {
if (!this._markdown) {
result.response = markdownToText(result.response)
}
return result
}
} while (!result)
// console.log('<<< EVALUATE', result)
// const lastMessage = await this.getLastMessage()
// await inputBox.focus()
// const paragraphs = message.split('\n')
// for (let i = 0; i < paragraphs.length; i++) {
// await inputBox.type(paragraphs[i], { delay: 0 })
// if (i < paragraphs.length - 1) {
// await this._page.keyboard.down('Shift')
// await inputBox.press('Enter')
// await this._page.keyboard.up('Shift')
// } else {
// await inputBox.press('Enter')
// }
// }
// const responseP = new Promise<string>(async (resolve, reject) => {
// try {
// do {
// await delay(1000)
// // TODO: this logic needs some work because we can have repeat messages...
// const newLastMessage = await this.getLastMessage()
// if (
// newLastMessage &&
// lastMessage?.toLowerCase() !== newLastMessage?.toLowerCase()
// ) {
// return resolve(newLastMessage)
// }
// } while (true)
// } catch (err) {
// return reject(err)
// }
// })
// if (timeoutMs) {
// return pTimeout(responseP, {
// milliseconds: timeoutMs
// })
// } else {
// return responseP
// }
}
async resetThread() {
try {
await this._page.click('nav > a:nth-child(1)')
} catch (err) {
// ignore for now
}
}
override async closeSession() {
try {
if (this._page) {
this._page.off('request', this._onRequest.bind(this))
this._page.off('response', this._onResponse.bind(this))
await this._page.deleteCookie({
name: 'cf_clearance',
domain: '.chat.openai.com'
})
// TODO; test this
// const client = await this._page.target().createCDPSession()
// await client.send('Network.clearBrowserCookies')
// await client.send('Network.clearBrowserCache')
await this._page.close()
}
} catch (err) {
console.warn('closeSession error', err)
}
if (this._browser) {
const pages = await this._browser.pages()
for (const page of pages) {
await page.close()
}
await this._browser.close()
// Rule number 1 of zombie process hunting: double-tap
if (this._browser.process()) {
this._browser.process().kill('SIGINT')
}
}
this._page = null
this._browser = null
this._accessToken = null
}
protected async _getInputBox() {
try {
return await this._page.$('textarea')
} catch (err) {
return null
}
}
get isChatPage(): boolean {
try {
const url = this._page?.url().replace(/\/$/, '')
return url === CHAT_PAGE_URL
} catch (err) {
return false
}
}
}
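// Illustrative sketch (not part of this file): typical usage of the
// browser-based client above. The environment variable names are placeholders.
//
//   const api = new ChatGPTAPIBrowser({
//     email: process.env.OPENAI_EMAIL,
//     password: process.env.OPENAI_PASSWORD
//   })
//   await api.initSession()
//   const result = await api.sendMessage('Write a haiku about TypeScript')
//   console.log(result.response)
//   await api.closeSession()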


@@ -1,131 +0,0 @@
import test from 'ava'
import dotenv from 'dotenv-safe'
import * as types from './types'
import { ChatGPTAPI } from './chatgpt-api'
dotenv.config()
const isCI = !!process.env.CI
test('ChatGPTAPI invalid session token', async (t) => {
t.timeout(30 * 1000) // 30 seconds
t.throws(() => new ChatGPTAPI({ sessionToken: null, clearanceToken: null }), {
message: 'ChatGPT invalid session token'
})
await t.throwsAsync(
async () => {
const chatgpt = new ChatGPTAPI({
sessionToken: 'invalid',
clearanceToken: 'invalid'
})
await chatgpt.initSession()
},
{
instanceOf: types.ChatGPTError,
message: 'ChatGPT failed to refresh auth token. Error: Unauthorized'
}
)
})
test('ChatGPTAPI valid session token', async (t) => {
if (!isCI) {
t.timeout(2 * 60 * 1000) // 2 minutes
}
t.notThrows(
() =>
new ChatGPTAPI({
sessionToken: 'fake valid session token',
clearanceToken: 'invalid'
})
)
await t.notThrowsAsync(
(async () => {
const chatgpt = new ChatGPTAPI({
sessionToken: process.env.SESSION_TOKEN,
clearanceToken: process.env.CLEARANCE_TOKEN
})
// Don't make any real API calls using our session token if we're running on CI
if (!isCI) {
await chatgpt.initSession()
const response = await chatgpt.sendMessage('test')
console.log('chatgpt response', response)
t.truthy(response)
t.is(typeof response, 'string')
}
})()
)
})
if (!isCI) {
test('ChatGPTAPI expired session token', async (t) => {
t.timeout(30 * 1000) // 30 seconds
const expiredSessionToken = process.env.TEST_EXPIRED_SESSION_TOKEN
await t.throwsAsync(
async () => {
const chatgpt = new ChatGPTAPI({
sessionToken: expiredSessionToken,
clearanceToken: 'invalid'
})
await chatgpt.initSession()
},
{
instanceOf: types.ChatGPTError,
message:
'ChatGPT failed to refresh auth token. Error: session token may have expired'
}
)
})
}
if (!isCI) {
test('ChatGPTAPI timeout', async (t) => {
t.timeout(30 * 1000) // 30 seconds
await t.throwsAsync(
async () => {
const chatgpt = new ChatGPTAPI({
sessionToken: process.env.SESSION_TOKEN,
clearanceToken: process.env.CLEARANCE_TOKEN
})
await chatgpt.sendMessage('test', {
timeoutMs: 1
})
},
{
message: 'ChatGPT timed out waiting for response'
}
)
})
test('ChatGPTAPI abort', async (t) => {
t.timeout(30 * 1000) // 30 seconds
await t.throwsAsync(
async () => {
const chatgpt = new ChatGPTAPI({
sessionToken: process.env.SESSION_TOKEN,
clearanceToken: process.env.CLEARANCE_TOKEN
})
const abortController = new AbortController()
setTimeout(() => abortController.abort(new Error('testing abort')), 10)
await chatgpt.sendMessage('test', {
abortSignal: abortController.signal
})
},
{
message: 'testing abort'
}
)
})
}


@@ -1,193 +1,149 @@
import ExpiryMap from 'expiry-map'
import Keyv from 'keyv'
import pTimeout from 'p-timeout'
import QuickLRU from 'quick-lru'
import { v4 as uuidv4 } from 'uuid'
import * as tokenizer from './tokenizer'
import * as types from './types'
import { AChatGPTAPI } from './abstract-chatgpt-api'
import { fetch } from './fetch'
import { fetch as globalFetch } from './fetch'
import { fetchSSE } from './fetch-sse'
import { markdownToText } from './utils'
const KEY_ACCESS_TOKEN = 'accessToken'
const USER_AGENT =
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36'
const CHATGPT_MODEL = 'gpt-3.5-turbo'
export class ChatGPTAPI extends AChatGPTAPI {
protected _sessionToken: string
protected _clearanceToken: string
protected _markdown: boolean
protected _debug: boolean
const USER_LABEL_DEFAULT = 'User'
const ASSISTANT_LABEL_DEFAULT = 'ChatGPT'
export class ChatGPTAPI {
protected _apiKey: string
protected _apiBaseUrl: string
protected _backendApiBaseUrl: string
protected _userAgent: string
protected _headers: Record<string, string>
protected _user: types.User | null = null
protected _apiOrg?: string
protected _debug: boolean
// Stores access tokens for `accessTokenTTL` milliseconds before needing to refresh
protected _accessTokenCache: ExpiryMap<string, string>
protected _systemMessage: string
protected _completionParams: Omit<
types.openai.CreateChatCompletionRequest,
'messages' | 'n'
>
protected _maxModelTokens: number
protected _maxResponseTokens: number
protected _fetch: types.FetchFn
protected _getMessageById: types.GetMessageByIdFunction
protected _upsertMessage: types.UpsertMessageFunction
protected _messageStore: Keyv<types.ChatMessage>
/**
* Creates a new client wrapper around the unofficial ChatGPT REST API.
* Creates a new client wrapper around OpenAI's chat completion API, mimicking the official ChatGPT webapp's functionality as closely as possible.
*
* Note that your IP address and `userAgent` must match the same values that you used
* to obtain your `clearanceToken`.
*
* @param opts.sessionToken = **Required** OpenAI session token which can be found in a valid session's cookies (see readme for instructions)
* @param opts.clearanceToken = **Required** Cloudflare `cf_clearance` cookie value (see readme for instructions)
* @param apiBaseUrl - Optional override; the base URL for ChatGPT webapp's API (`/api`)
* @param backendApiBaseUrl - Optional override; the base URL for the ChatGPT backend API (`/backend-api`)
* @param userAgent - Optional override; the `user-agent` header to use with ChatGPT requests
* @param accessTokenTTL - Optional override; how long in milliseconds access tokens should last before being forcefully refreshed
* @param accessToken - Optional default access token if you already have a valid one generated
* @param headers - Optional additional HTTP headers to be added to each `fetch` request
* @param debug - Optionally enables logging debugging info to stdout
* @param apiKey - OpenAI API key (required).
* @param apiOrg - OpenAI API organization (optional).
* @param apiBaseUrl - Optional override for the OpenAI API base URL.
* @param debug - Optionally enables logging debugging info to stdout.
* @param completionParams - Param overrides to send to the [OpenAI chat completion API](https://platform.openai.com/docs/api-reference/chat/create). Options like `temperature` and `presence_penalty` can be tweaked to change the personality of the assistant.
* @param maxModelTokens - Optional override for the maximum number of tokens allowed by the model's context. Defaults to 4096.
* @param maxResponseTokens - Optional override for the minimum number of tokens allowed for the model's response. Defaults to 1000.
* @param messageStore - Optional [Keyv](https://github.com/jaredwray/keyv) store to persist chat messages to. If not provided, messages will be lost when the process exits.
* @param getMessageById - Optional function to retrieve a message by its ID. If not provided, the default implementation will be used (using an in-memory `messageStore`).
* @param upsertMessage - Optional function to insert or update a message. If not provided, the default implementation will be used (using an in-memory `messageStore`).
* @param fetch - Optional override for the `fetch` implementation to use. Defaults to the global `fetch` function.
*/
constructor(opts: {
sessionToken: string
clearanceToken: string
/** @defaultValue `true` **/
markdown?: boolean
/** @defaultValue `'https://chat.openai.com/api'` **/
apiBaseUrl?: string
/** @defaultValue `'https://chat.openai.com/backend-api'` **/
backendApiBaseUrl?: string
/** @defaultValue `Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36'` **/
userAgent?: string
/** @defaultValue 1 hour **/
accessTokenTTL?: number
/** @defaultValue `undefined` **/
accessToken?: string
/** @defaultValue `undefined` **/
headers?: Record<string, string>
/** @defaultValue `false` **/
debug?: boolean
}) {
super()
constructor(opts: types.ChatGPTAPIOptions) {
const {
sessionToken,
clearanceToken,
markdown = true,
apiBaseUrl = 'https://chat.openai.com/api',
backendApiBaseUrl = 'https://chat.openai.com/backend-api',
userAgent = USER_AGENT,
accessTokenTTL = 60 * 60000, // 1 hour
accessToken,
headers,
debug = false
apiKey,
apiOrg,
apiBaseUrl = 'https://api.openai.com/v1',
debug = false,
messageStore,
completionParams,
systemMessage,
maxModelTokens = 4000,
maxResponseTokens = 1000,
getMessageById,
upsertMessage,
fetch = globalFetch
} = opts
this._sessionToken = sessionToken
this._clearanceToken = clearanceToken
this._markdown = !!markdown
this._debug = !!debug
this._apiKey = apiKey
this._apiOrg = apiOrg
this._apiBaseUrl = apiBaseUrl
this._backendApiBaseUrl = backendApiBaseUrl
this._userAgent = userAgent
this._headers = {
'user-agent': this._userAgent,
'x-openai-assistant-app-id': '',
'accept-language': 'en-US,en;q=0.9',
'accept-encoding': 'gzip, deflate, br',
origin: 'https://chat.openai.com',
referer: 'https://chat.openai.com/chat',
'sec-ch-ua':
'"Not?A_Brand";v="8", "Chromium";v="108", "Google Chrome";v="108"',
'sec-ch-ua-platform': '"macOS"',
'sec-fetch-dest': 'empty',
'sec-fetch-mode': 'cors',
'sec-fetch-site': 'same-origin',
...headers
this._debug = !!debug
this._fetch = fetch
this._completionParams = {
model: CHATGPT_MODEL,
temperature: 0.8,
top_p: 1.0,
presence_penalty: 1.0,
...completionParams
}
this._accessTokenCache = new ExpiryMap<string, string>(accessTokenTTL)
if (accessToken) {
this._accessTokenCache.set(KEY_ACCESS_TOKEN, accessToken)
this._systemMessage = systemMessage
if (this._systemMessage === undefined) {
const currentDate = new Date().toISOString().split('T')[0]
this._systemMessage = `You are ChatGPT, a large language model trained by OpenAI. Answer as concisely as possible.\nKnowledge cutoff: 2021-09-01\nCurrent date: ${currentDate}`
}
if (!this._sessionToken) {
const error = new types.ChatGPTError('ChatGPT invalid session token')
error.statusCode = 401
throw error
this._maxModelTokens = maxModelTokens
this._maxResponseTokens = maxResponseTokens
this._getMessageById = getMessageById ?? this._defaultGetMessageById
this._upsertMessage = upsertMessage ?? this._defaultUpsertMessage
if (messageStore) {
this._messageStore = messageStore
} else {
this._messageStore = new Keyv<types.ChatMessage, any>({
store: new QuickLRU<string, types.ChatMessage>({ maxSize: 10000 })
})
}
if (!this._clearanceToken) {
const error = new types.ChatGPTError('ChatGPT invalid clearance token')
error.statusCode = 401
throw error
if (!this._apiKey) {
throw new Error('OpenAI missing required apiKey')
}
if (!this._fetch) {
throw new Error('Invalid environment; fetch is not defined')
}
if (typeof this._fetch !== 'function') {
throw new Error('Invalid "fetch" is not a function')
}
}
/**
* Gets the currently signed-in user, if authenticated, `null` otherwise.
*/
get user() {
return this._user
}
/** Gets the current session token. */
get sessionToken() {
return this._sessionToken
}
/** Gets the current Cloudflare clearance token (`cf_clearance` cookie value). */
get clearanceToken() {
return this._clearanceToken
}
/** Gets the current user agent. */
get userAgent() {
return this._userAgent
}
/**
* Refreshes the client's access token which will succeed only if the session
* is valid.
*/
override async initSession() {
await this.refreshSession()
}
/**
* Sends a message to ChatGPT, waits for the response to resolve, and returns
* the response.
* Sends a message to the OpenAI chat completions endpoint, waits for the response
* to resolve, and returns the response.
*
* If you want your response to have historical context, you must provide a valid `parentMessageId`.
*
* If you want to receive a stream of partial responses, use `opts.onProgress`.
* If you want to receive the full response, including message and conversation IDs,
* you can use `opts.onConversationResponse` or use the `ChatGPTAPI.getConversation`
* helper.
*
* Set `debug: true` in the `ChatGPTAPI` constructor to log more info on the full prompt sent to the OpenAI chat completions API. You can override the `systemMessage` in `opts` to customize the assistant's instructions.
*
* @param message - The prompt message to send
* @param opts.conversationId - Optional ID of a conversation to continue
* @param opts.parentMessageId - Optional ID of the previous message in the conversation
* @param opts.parentMessageId - Optional ID of the previous message in the conversation (defaults to `undefined`)
* @param opts.messageId - Optional ID of the message to send (defaults to a random UUID)
* @param opts.action - Optional ChatGPT `action` (either `next` or `variant`)
* @param opts.systemMessage - Optional override for the chat "system message" which acts as instructions to the model (defaults to the ChatGPT system message)
* @param opts.timeoutMs - Optional timeout in milliseconds (defaults to no timeout)
* @param opts.onProgress - Optional callback which will be invoked every time the partial response is updated
* @param opts.abortSignal - Optional callback used to abort the underlying `fetch` call using an [AbortController](https://developer.mozilla.org/en-US/docs/Web/API/AbortController)
* @param completionParams - Optional overrides to send to the [OpenAI chat completion API](https://platform.openai.com/docs/api-reference/chat/create). Options like `temperature` and `presence_penalty` can be tweaked to change the personality of the assistant.
*
* @returns The response from ChatGPT
*/
override async sendMessage(
message: string,
async sendMessage(
text: string,
opts: types.SendMessageOptions = {}
): Promise<types.ChatResponse> {
): Promise<types.ChatMessage> {
const {
conversationId,
parentMessageId = uuidv4(),
parentMessageId,
messageId = uuidv4(),
action = 'next',
timeoutMs,
onProgress
onProgress,
stream = onProgress ? true : false,
completionParams
} = opts
let { abortSignal } = opts
@@ -198,109 +154,170 @@ export class ChatGPTAPI extends AChatGPTAPI {
abortSignal = abortController.signal
}
const accessToken = await this.refreshSession()
const message: types.ChatMessage = {
role: 'user',
id: messageId,
parentMessageId,
text
}
const body: types.ConversationJSONBody = {
action,
messages: [
{
id: messageId,
role: 'user',
content: {
content_type: 'text',
parts: [message]
}
const latestQuestion = message
const { messages, maxTokens, numTokens } = await this._buildMessages(
text,
opts
)
const result: types.ChatMessage = {
role: 'assistant',
id: uuidv4(),
parentMessageId: messageId,
text: ''
}
const responseP = new Promise<types.ChatMessage>(
async (resolve, reject) => {
const url = `${this._apiBaseUrl}/chat/completions`
const headers = {
'Content-Type': 'application/json',
Authorization: `Bearer ${this._apiKey}`
}
const body = {
max_tokens: maxTokens,
...this._completionParams,
...completionParams,
messages,
stream
}
],
model: 'text-davinci-002-render',
parent_message_id: parentMessageId
}
if (conversationId) {
body.conversation_id = conversationId
}
// Support multiple organizations
// See https://platform.openai.com/docs/api-reference/authentication
if (this._apiOrg) {
headers['OpenAI-Organization'] = this._apiOrg
}
const result: types.ChatResponse = {
conversationId,
messageId,
response: ''
}
if (this._debug) {
console.log(`sendMessage (${numTokens} tokens)`, body)
}
const responseP = new Promise<types.ChatResponse>((resolve, reject) => {
const url = `${this._backendApiBaseUrl}/conversation`
const headers = {
...this._headers,
Authorization: `Bearer ${accessToken}`,
Accept: 'text/event-stream',
'Content-Type': 'application/json',
Cookie: `cf_clearance=${this._clearanceToken}`
}
if (this._debug) {
console.log('POST', url, { body, headers })
}
fetchSSE(url, {
method: 'POST',
headers,
body: JSON.stringify(body),
signal: abortSignal,
onMessage: (data: string) => {
if (data === '[DONE]') {
return resolve(result)
}
try {
const convoResponseEvent: types.ConversationResponseEvent =
JSON.parse(data)
if (convoResponseEvent.conversation_id) {
result.conversationId = convoResponseEvent.conversation_id
}
if (convoResponseEvent.message?.id) {
result.messageId = convoResponseEvent.message.id
}
const message = convoResponseEvent.message
// console.log('event', JSON.stringify(convoResponseEvent, null, 2))
if (message) {
let text = message?.content?.parts?.[0]
if (text) {
if (!this._markdown) {
text = markdownToText(text)
if (stream) {
fetchSSE(
url,
{
method: 'POST',
headers,
body: JSON.stringify(body),
signal: abortSignal,
onMessage: (data: string) => {
if (data === '[DONE]') {
result.text = result.text.trim()
return resolve(result)
}
result.response = text
try {
const response: types.openai.CreateChatCompletionDeltaResponse =
JSON.parse(data)
if (onProgress) {
onProgress(result)
if (response.id) {
result.id = response.id
}
if (response.choices?.length) {
const delta = response.choices[0].delta
result.delta = delta.content
if (delta?.content) result.text += delta.content
if (delta.role) {
result.role = delta.role
}
result.detail = response
onProgress?.(result)
}
} catch (err) {
console.warn('OpenAI stream SSE event unexpected error', err)
return reject(err)
}
}
},
this._fetch
).catch(reject)
} else {
try {
const res = await this._fetch(url, {
method: 'POST',
headers,
body: JSON.stringify(body),
signal: abortSignal
})
if (!res.ok) {
const reason = await res.text()
const msg = `OpenAI error ${
res.status || res.statusText
}: ${reason}`
const error = new types.ChatGPTError(msg, { cause: res })
error.statusCode = res.status
error.statusText = res.statusText
return reject(error)
}
const response: types.openai.CreateChatCompletionResponse =
await res.json()
if (this._debug) {
console.log(response)
}
if (response?.id) {
result.id = response.id
}
if (response?.choices?.length) {
const message = response.choices[0].message
result.text = message.content
if (message.role) {
result.role = message.role
}
} else {
const res = response as any
return reject(
new Error(
`OpenAI error: ${
res?.detail?.message || res?.detail || 'unknown'
}`
)
)
}
result.detail = response
return resolve(result)
} catch (err) {
console.warn('fetchSSE onMessage unexpected error', err)
reject(err)
return reject(err)
}
}
}).catch((err) => {
const errMessageL = err.toString().toLowerCase()
if (
result.response &&
(errMessageL === 'error: typeerror: terminated' ||
errMessageL === 'typeerror: terminated')
) {
// OpenAI sometimes forcefully terminates the socket from their end before
// the HTTP request has resolved cleanly. In my testing, these cases tend to
// happen when OpenAI has already sent the last `response`, so we can ignore
// the `fetch` error in this case.
return resolve(result)
} else {
return reject(err)
}
).then(async (message) => {
if (message.detail && !message.detail.usage) {
try {
const promptTokens = numTokens
const completionTokens = await this._getTokenCount(message.text)
message.detail.usage = {
prompt_tokens: promptTokens,
completion_tokens: completionTokens,
total_tokens: promptTokens + completionTokens,
estimated: true
}
} catch (err) {
// TODO: this should really never happen, but if it does,
// we should handle it gracefully and notify the user
}
})
}
return Promise.all([
this._upsertMessage(latestQuestion),
this._upsertMessage(message)
]).then(() => message)
})
if (timeoutMs) {
@@ -314,160 +331,136 @@ export class ChatGPTAPI extends AChatGPTAPI {
return pTimeout(responseP, {
milliseconds: timeoutMs,
message: 'ChatGPT timed out waiting for response'
message: 'OpenAI timed out waiting for response'
})
} else {
return responseP
}
}
async sendModeration(input: string) {
const accessToken = await this.refreshSession()
const url = `${this._backendApiBaseUrl}/moderations`
const headers = {
...this._headers,
Authorization: `Bearer ${accessToken}`,
Accept: '*/*',
'Content-Type': 'application/json',
Cookie: `cf_clearance=${this._clearanceToken}`
get apiKey(): string {
return this._apiKey
}
set apiKey(apiKey: string) {
this._apiKey = apiKey
}
get apiOrg(): string {
return this._apiOrg
}
set apiOrg(apiOrg: string) {
this._apiOrg = apiOrg
}
protected async _buildMessages(text: string, opts: types.SendMessageOptions) {
const { systemMessage = this._systemMessage } = opts
let { parentMessageId } = opts
const userLabel = USER_LABEL_DEFAULT
const assistantLabel = ASSISTANT_LABEL_DEFAULT
const maxNumTokens = this._maxModelTokens - this._maxResponseTokens
let messages: types.openai.ChatCompletionRequestMessage[] = []
if (systemMessage) {
messages.push({
role: 'system',
content: systemMessage
})
}
const body: types.ModerationsJSONBody = {
input,
model: 'text-moderation-playground'
}
const systemMessageOffset = messages.length
let nextMessages = text
? messages.concat([
{
role: 'user',
content: text,
name: opts.name
}
])
: messages
let numTokens = 0
if (this._debug) {
console.log('POST', url, headers, body)
}
do {
const prompt = nextMessages
.reduce((prompt, message) => {
switch (message.role) {
case 'system':
return prompt.concat([`Instructions:\n${message.content}`])
case 'user':
return prompt.concat([`${userLabel}:\n${message.content}`])
default:
return prompt.concat([`${assistantLabel}:\n${message.content}`])
}
}, [] as string[])
.join('\n\n')
const res = await fetch(url, {
method: 'POST',
headers,
body: JSON.stringify(body)
}).then((r) => {
if (!r.ok) {
const error = new types.ChatGPTError(`${r.status} ${r.statusText}`)
error.response = r
error.statusCode = r.status
error.statusText = r.statusText
throw error
const nextNumTokensEstimate = await this._getTokenCount(prompt)
const isValidPrompt = nextNumTokensEstimate <= maxNumTokens
if (prompt && !isValidPrompt) {
break
}
return r.json() as any as types.ModerationsJSONResult
})
messages = nextMessages
numTokens = nextNumTokensEstimate
if (!isValidPrompt) {
break
}
if (!parentMessageId) {
break
}
const parentMessage = await this._getMessageById(parentMessageId)
if (!parentMessage) {
break
}
const parentMessageRole = parentMessage.role || 'user'
nextMessages = nextMessages.slice(0, systemMessageOffset).concat([
{
role: parentMessageRole,
content: parentMessage.text,
name: parentMessage.name
},
...nextMessages.slice(systemMessageOffset)
])
parentMessageId = parentMessage.parentMessageId
} while (true)
// Use up to 4096 tokens (prompt + response), but try to leave 1000 tokens
// for the response.
const maxTokens = Math.max(
1,
Math.min(this._maxModelTokens - numTokens, this._maxResponseTokens)
)
return { messages, maxTokens, numTokens }
}
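// Worked example of the budget above (illustrative numbers): with
// maxModelTokens = 4000 and maxResponseTokens = 1000, prompts may use at most
// 3000 tokens; if the assembled messages come to 2500 tokens, then
// maxTokens = max(1, min(4000 - 2500, 1000)) = 1000 is left for the completion.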
protected async _getTokenCount(text: string) {
// TODO: use a better fix in the tokenizer
text = text.replace(/<\|endoftext\|>/g, '')
return tokenizer.encode(text).length
}
protected async _defaultGetMessageById(
id: string
): Promise<types.ChatMessage> {
const res = await this._messageStore.get(id)
return res
}
/**
* @returns `true` if the client has a valid access token or `false` if refreshing
* the token fails.
*/
override async getIsAuthenticated() {
try {
void (await this.refreshSession())
return true
} catch (err) {
return false
}
}
/**
* Attempts to refresh the current access token using the ChatGPT
* `sessionToken` cookie.
*
* Access tokens will be cached for up to `accessTokenTTL` milliseconds to
* prevent refreshing access tokens too frequently.
*
* @returns A valid access token
* @throws An error if refreshing the access token fails.
*/
override async refreshSession(): Promise<string> {
const cachedAccessToken = this._accessTokenCache.get(KEY_ACCESS_TOKEN)
if (cachedAccessToken) {
return cachedAccessToken
}
let response: Response
try {
const url = `${this._apiBaseUrl}/auth/session`
const headers = {
...this._headers,
cookie: `cf_clearance=${this._clearanceToken}; __Secure-next-auth.session-token=${this._sessionToken}`,
accept: '*/*'
}
if (this._debug) {
console.log('GET', url, headers)
}
const res = await fetch(url, {
headers
}).then((r) => {
response = r
if (!r.ok) {
const error = new types.ChatGPTError(`${r.status} ${r.statusText}`)
error.response = r
error.statusCode = r.status
error.statusText = r.statusText
throw error
}
return r.json() as any as types.SessionResult
})
const accessToken = res?.accessToken
if (!accessToken) {
const error = new types.ChatGPTError('Unauthorized')
error.response = response
error.statusCode = response?.status
error.statusText = response?.statusText
throw error
}
const appError = res?.error
if (appError) {
if (appError === 'RefreshAccessTokenError') {
const error = new types.ChatGPTError('session token may have expired')
error.response = response
error.statusCode = response?.status
error.statusText = response?.statusText
throw error
} else {
const error = new types.ChatGPTError(appError)
error.response = response
error.statusCode = response?.status
error.statusText = response?.statusText
throw error
}
}
if (res.user) {
this._user = res.user
}
this._accessTokenCache.set(KEY_ACCESS_TOKEN, accessToken)
return accessToken
} catch (err: any) {
if (this._debug) {
console.error(err)
}
const error = new types.ChatGPTError(
`ChatGPT failed to refresh auth token. ${err.toString()}`
)
error.response = response
error.statusCode = response?.status
error.statusText = response?.statusText
error.originalError = err
throw error
}
}
override async closeSession(): Promise<void> {
this._accessTokenCache.delete(KEY_ACCESS_TOKEN)
protected async _defaultUpsertMessage(
message: types.ChatMessage
): Promise<void> {
await this._messageStore.set(message.id, message)
}
}
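// Illustrative sketch (not part of this file): using the OpenAI-backed client
// above, streaming partial results via `onProgress` and threading a follow-up
// message with `parentMessageId`.
//
//   const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })
//
//   let res = await api.sendMessage('What is the capital of France?', {
//     onProgress: (partial) => process.stdout.write(partial.delta ?? '')
//   })
//
//   res = await api.sendMessage('Can you expand on that?', {
//     parentMessageId: res.id
//   })
//   console.log(res.text)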


@@ -0,0 +1,263 @@
import pTimeout from 'p-timeout'
import { v4 as uuidv4 } from 'uuid'
import * as types from './types'
import { fetch as globalFetch } from './fetch'
import { fetchSSE } from './fetch-sse'
import { isValidUUIDv4 } from './utils'
export class ChatGPTUnofficialProxyAPI {
protected _accessToken: string
protected _apiReverseProxyUrl: string
protected _debug: boolean
protected _model: string
protected _headers: Record<string, string>
protected _fetch: types.FetchFn
/**
* @param fetch - Optional override for the `fetch` implementation to use. Defaults to the global `fetch` function.
*/
constructor(opts: {
accessToken: string
/** @defaultValue `https://bypass.duti.tech/api/conversation` **/
apiReverseProxyUrl?: string
/** @defaultValue `text-davinci-002-render-sha` **/
model?: string
/** @defaultValue `false` **/
debug?: boolean
/** @defaultValue `undefined` **/
headers?: Record<string, string>
fetch?: types.FetchFn
}) {
const {
accessToken,
apiReverseProxyUrl = 'https://bypass.duti.tech/api/conversation',
model = 'text-davinci-002-render-sha',
debug = false,
headers,
fetch = globalFetch
} = opts
this._accessToken = accessToken
this._apiReverseProxyUrl = apiReverseProxyUrl
this._debug = !!debug
this._model = model
this._fetch = fetch
this._headers = headers
if (!this._accessToken) {
throw new Error('ChatGPT invalid accessToken')
}
if (!this._fetch) {
throw new Error('Invalid environment; fetch is not defined')
}
if (typeof this._fetch !== 'function') {
throw new Error('Invalid "fetch" is not a function')
}
}
get accessToken(): string {
return this._accessToken
}
set accessToken(value: string) {
this._accessToken = value
}
/**
* Sends a message to ChatGPT, waits for the response to resolve, and returns
* the response.
*
* If you want your response to have historical context, you must provide a valid `parentMessageId`.
*
* If you want to receive a stream of partial responses, use `opts.onProgress`.
* If you want to receive the full response, including message and conversation IDs,
* you can use `opts.onConversationResponse` or use the `ChatGPTAPI.getConversation`
* helper.
*
* Set `debug: true` in the `ChatGPTAPI` constructor to log more info on the full prompt sent to the OpenAI completions API. You can override the `promptPrefix` and `promptSuffix` in `opts` to customize the prompt.
*
* @param message - The prompt message to send
* @param opts.conversationId - Optional ID of a conversation to continue (defaults to a random UUID)
* @param opts.parentMessageId - Optional ID of the previous message in the conversation (defaults to `undefined`)
* @param opts.messageId - Optional ID of the message to send (defaults to a random UUID)
* @param opts.timeoutMs - Optional timeout in milliseconds (defaults to no timeout)
* @param opts.onProgress - Optional callback which will be invoked every time the partial response is updated
* @param opts.abortSignal - Optional callback used to abort the underlying `fetch` call using an [AbortController](https://developer.mozilla.org/en-US/docs/Web/API/AbortController)
*
* @returns The response from ChatGPT
*/
async sendMessage(
text: string,
opts: types.SendMessageBrowserOptions = {}
): Promise<types.ChatMessage> {
if (!!opts.conversationId !== !!opts.parentMessageId) {
throw new Error(
'ChatGPTUnofficialProxyAPI.sendMessage: conversationId and parentMessageId must both be set or both be undefined'
)
}
if (opts.conversationId && !isValidUUIDv4(opts.conversationId)) {
throw new Error(
'ChatGPTUnofficialProxyAPI.sendMessage: conversationId is not a valid v4 UUID'
)
}
if (opts.parentMessageId && !isValidUUIDv4(opts.parentMessageId)) {
throw new Error(
'ChatGPTUnofficialProxyAPI.sendMessage: parentMessageId is not a valid v4 UUID'
)
}
if (opts.messageId && !isValidUUIDv4(opts.messageId)) {
throw new Error(
'ChatGPTUnofficialProxyAPI.sendMessage: messageId is not a valid v4 UUID'
)
}
const {
conversationId,
parentMessageId = uuidv4(),
messageId = uuidv4(),
action = 'next',
timeoutMs,
onProgress
} = opts
let { abortSignal } = opts
let abortController: AbortController = null
if (timeoutMs && !abortSignal) {
abortController = new AbortController()
abortSignal = abortController.signal
}
const body: types.ConversationJSONBody = {
action,
messages: [
{
id: messageId,
role: 'user',
content: {
content_type: 'text',
parts: [text]
}
}
],
model: this._model,
parent_message_id: parentMessageId
}
if (conversationId) {
body.conversation_id = conversationId
}
const result: types.ChatMessage = {
role: 'assistant',
id: uuidv4(),
parentMessageId: messageId,
conversationId,
text: ''
}
const responseP = new Promise<types.ChatMessage>((resolve, reject) => {
const url = this._apiReverseProxyUrl
const headers = {
...this._headers,
Authorization: `Bearer ${this._accessToken}`,
Accept: 'text/event-stream',
'Content-Type': 'application/json'
}
if (this._debug) {
console.log('POST', url, { body, headers })
}
fetchSSE(
url,
{
method: 'POST',
headers,
body: JSON.stringify(body),
signal: abortSignal,
onMessage: (data: string) => {
if (data === '[DONE]') {
return resolve(result)
}
try {
const convoResponseEvent: types.ConversationResponseEvent =
JSON.parse(data)
if (convoResponseEvent.conversation_id) {
result.conversationId = convoResponseEvent.conversation_id
}
if (convoResponseEvent.message?.id) {
result.id = convoResponseEvent.message.id
}
const message = convoResponseEvent.message
// console.log('event', JSON.stringify(convoResponseEvent, null, 2))
if (message) {
let text = message?.content?.parts?.[0]
if (text) {
result.text = text
if (onProgress) {
onProgress(result)
}
}
}
} catch (err) {
// ignore for now; there seem to be some non-json messages
// console.warn('fetchSSE onMessage unexpected error', err)
}
}
},
this._fetch
).catch((err) => {
const errMessageL = err.toString().toLowerCase()
if (
result.text &&
(errMessageL === 'error: typeerror: terminated' ||
errMessageL === 'typeerror: terminated')
) {
// OpenAI sometimes forcefully terminates the socket from their end before
// the HTTP request has resolved cleanly. In my testing, these cases tend to
// happen when OpenAI has already sent the last `response`, so we can ignore
// the `fetch` error in this case.
return resolve(result)
} else {
return reject(err)
}
})
})
if (timeoutMs) {
if (abortController) {
// This will be called when a timeout occurs in order for us to forcibly
// ensure that the underlying HTTP request is aborted.
;(responseP as any).cancel = () => {
abortController.abort()
}
}
return pTimeout(responseP, {
milliseconds: timeoutMs,
message: 'ChatGPT timed out waiting for response'
})
} else {
return responseP
}
}
}
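// Illustrative sketch (not part of this file): the proxy-based client above
// needs a ChatGPT `accessToken`; to continue a conversation, pass back both
// `conversationId` and `parentMessageId` from the previous response (the
// method throws if only one of them is set).
//
//   const api = new ChatGPTUnofficialProxyAPI({
//     accessToken: process.env.OPENAI_ACCESS_TOKEN
//   })
//   const first = await api.sendMessage('Hello')
//   const followUp = await api.sendMessage('Tell me more', {
//     conversationId: first.conversationId,
//     parentMessageId: first.id
//   })
//   console.log(followUp.text)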


@@ -1,21 +1,29 @@
import { createParser } from 'eventsource-parser'
import * as types from './types'
import { fetch as globalFetch } from './fetch'
import { streamAsyncIterable } from './stream-async-iterable'
export async function fetchSSE(
url: string,
options: Parameters<typeof fetch>[1] & { onMessage: (data: string) => void },
fetch: types.FetchFn = globalFetch
) {
const { onMessage, ...fetchOptions } = options
const res = await fetch(url, fetchOptions)
if (!res.ok) {
let reason: string
try {
reason = await res.text()
} catch (err) {
reason = res.statusText
}
const msg = `ChatGPT error ${res.status}: ${reason}`
const error = new types.ChatGPTError(msg, { cause: res })
error.statusCode = res.status
error.statusText = res.statusText
error.response = res
throw error
}


@@ -1,13 +1,5 @@
/// <reference lib="dom" />
// Use `fetch` for node.js >= 18
// Use `fetch` for all other environments, including browsers
const fetch = globalThis.fetch
if (typeof fetch !== 'function') {
throw new Error(
'Invalid environment: global fetch not defined; `chatgpt` requires Node.js >= 18 at the moment due to Cloudflare protections'
)
}
export { fetch }


@@ -1,6 +1,3 @@
export * from './chatgpt-api'
export * from './chatgpt-api-browser'
export * from './abstract-chatgpt-api'
export * from './chatgpt-unofficial-proxy-api'
export * from './types'
export * from './utils'
export * from './openai-auth'


@@ -1,646 +0,0 @@
import * as fs from 'node:fs'
import * as os from 'node:os'
import * as path from 'node:path'
import * as url from 'node:url'
import delay from 'delay'
import { TimeoutError } from 'p-timeout'
import { Browser, Page, Protocol, PuppeteerLaunchOptions } from 'puppeteer'
import puppeteer from 'puppeteer-extra'
import RecaptchaPlugin from 'puppeteer-extra-plugin-recaptcha'
import StealthPlugin from 'puppeteer-extra-plugin-stealth'
import random from 'random'
import * as types from './types'
import { minimizePage } from './utils'
puppeteer.use(StealthPlugin())
let hasRecaptchaPlugin = false
let hasNopechaExtension = false
const __dirname = url.fileURLToPath(new URL('.', import.meta.url))
/**
* Represents everything that's required to pass into `ChatGPTAPI` in order
* to authenticate with the unofficial ChatGPT API.
*/
export type OpenAIAuth = {
userAgent: string
clearanceToken: string
sessionToken: string
cookies?: Record<string, Protocol.Network.Cookie>
}
/**
* Bypasses OpenAI's use of Cloudflare to get the cookies required to use
* ChatGPT. Uses Puppeteer with a stealth plugin under the hood.
*
* If you pass `email` and `password`, then it will log into the account and
* include a `sessionToken` in the response.
*
* If you don't pass `email` and `password`, then it will just return a valid
* `clearanceToken`.
*
* This can be useful because `clearanceToken` expires after ~2 hours, whereas
* `sessionToken` generally lasts much longer. We recommend renewing your
* `clearanceToken` every hour or so and creating a new instance of `ChatGPTAPI`
* with your updated credentials.
*/
export async function getOpenAIAuth({
email,
password,
browser,
page,
timeoutMs = 3 * 60 * 1000,
isGoogleLogin = false,
isMicrosoftLogin = false,
captchaToken = process.env.CAPTCHA_TOKEN,
nopechaKey = process.env.NOPECHA_KEY,
executablePath,
proxyServer = process.env.PROXY_SERVER,
minimize = false
}: {
email?: string
password?: string
browser?: Browser
page?: Page
timeoutMs?: number
isGoogleLogin?: boolean
isMicrosoftLogin?: boolean
minimize?: boolean
captchaToken?: string
nopechaKey?: string
executablePath?: string
proxyServer?: string
}): Promise<OpenAIAuth> {
const origBrowser = browser
const origPage = page
try {
if (!browser) {
browser = await getBrowser({
captchaToken,
nopechaKey,
executablePath,
proxyServer
})
}
const userAgent = await browser.userAgent()
if (!page) {
page = (await browser.pages())[0] || (await browser.newPage())
page.setDefaultTimeout(timeoutMs)
if (minimize) {
await minimizePage(page)
}
}
await page.goto('https://chat.openai.com/auth/login', {
waitUntil: 'networkidle2'
})
// NOTE: this is where you may encounter a CAPTCHA
await checkForChatGPTAtCapacity(page, { timeoutMs })
if (hasRecaptchaPlugin) {
const captchas = await page.findRecaptchas()
if (captchas?.filtered?.length) {
console.log('solving captchas using 2captcha...')
const res = await page.solveRecaptchas()
console.log('captcha result', res)
}
}
// once we get to this point, the Cloudflare cookies should be available
// login as well (optional)
if (email && password) {
await waitForConditionOrAtCapacity(page, () =>
page.waitForSelector('#__next .btn-primary', { timeout: timeoutMs })
)
await delay(500)
// click login button and wait for navigation to finish
await Promise.all([
page.waitForNavigation({
waitUntil: 'networkidle2',
timeout: timeoutMs
}),
page.click('#__next .btn-primary')
])
await checkForChatGPTAtCapacity(page, { timeoutMs })
let submitP: () => Promise<void>
if (isGoogleLogin) {
await page.click('button[data-provider="google"]')
await page.waitForSelector('input[type="email"]')
await page.type('input[type="email"]', email, { delay: 10 })
await Promise.all([
page.waitForNavigation(),
await page.keyboard.press('Enter')
])
await page.waitForSelector('input[type="password"]', { visible: true })
await page.type('input[type="password"]', password, { delay: 10 })
submitP = () => page.keyboard.press('Enter')
} else if (isMicrosoftLogin) {
await page.click('button[data-provider="windowslive"]')
await page.waitForSelector('input[type="email"]')
await page.type('input[type="email"]', email, { delay: 10 })
await Promise.all([
page.waitForNavigation(),
await page.keyboard.press('Enter')
])
await delay(1500)
await page.waitForSelector('input[type="password"]', { visible: true })
await page.type('input[type="password"]', password, { delay: 10 })
submitP = () => page.keyboard.press('Enter')
await Promise.all([
page.waitForNavigation(),
await page.keyboard.press('Enter')
])
await delay(1000)
} else {
await page.waitForSelector('#username')
await page.type('#username', email)
await delay(100)
// NOTE: this is where you may encounter a CAPTCHA
if (hasNopechaExtension) {
await waitForRecaptcha(page, { timeoutMs })
} else if (hasRecaptchaPlugin) {
console.log('solving captchas using 2captcha...')
const res = await page.solveRecaptchas()
if (res.captchas?.length) {
console.log('captchas result', res)
} else {
console.log('no captchas found')
}
}
await delay(1200)
const frame = page.mainFrame()
const submit = await page.waitForSelector('button[type="submit"]', {
timeout: timeoutMs
})
frame.focus('button[type="submit"]')
await submit.focus()
await submit.click()
await page.waitForSelector('#password', { timeout: timeoutMs })
await page.type('#password', password, { delay: 10 })
submitP = () => page.click('button[type="submit"]')
}
await Promise.all([
waitForConditionOrAtCapacity(page, () =>
page.waitForNavigation({
waitUntil: 'networkidle2',
timeout: timeoutMs
})
),
submitP()
])
} else {
await delay(2000)
await checkForChatGPTAtCapacity(page, { timeoutMs })
}
const pageCookies = await page.cookies()
const cookies = pageCookies.reduce(
(map, cookie) => ({ ...map, [cookie.name]: cookie }),
{}
)
const authInfo: OpenAIAuth = {
userAgent,
clearanceToken: cookies['cf_clearance']?.value,
sessionToken: cookies['__Secure-next-auth.session-token']?.value,
cookies
}
return authInfo
} catch (err) {
throw err
} finally {
if (origBrowser) {
if (page && page !== origPage) {
await page.close()
}
} else if (browser) {
await browser.close()
}
page = null
browser = null
}
}
/**
* Launches a non-puppeteer instance of Chrome. Note that in my testing, I wasn't
* able to use the built-in `puppeteer` version of Chromium because Cloudflare
* recognizes it and blocks access.
*/
export async function getBrowser(
opts: PuppeteerLaunchOptions & {
captchaToken?: string
nopechaKey?: string
proxyServer?: string
minimize?: boolean
} = {}
) {
const {
captchaToken = process.env.CAPTCHA_TOKEN,
nopechaKey = process.env.NOPECHA_KEY,
executablePath = defaultChromeExecutablePath(),
proxyServer = process.env.PROXY_SERVER,
minimize = false,
...launchOptions
} = opts
if (captchaToken && !hasRecaptchaPlugin) {
hasRecaptchaPlugin = true
// console.log('use captcha', captchaToken)
puppeteer.use(
RecaptchaPlugin({
provider: {
id: '2captcha',
token: captchaToken
},
visualFeedback: true // colorize reCAPTCHAs (violet = detected, green = solved)
})
)
}
// https://peter.sh/experiments/chromium-command-line-switches/
const puppeteerArgs = [
'--no-sandbox',
'--disable-setuid-sandbox',
'--disable-infobars',
'--disable-dev-shm-usage',
'--disable-blink-features=AutomationControlled',
'--ignore-certificate-errors',
'--no-first-run',
'--no-service-autorun',
'--password-store=basic',
'--system-developer-mode',
// the following flags all try to reduce memory
// '--single-process',
'--mute-audio',
'--disable-default-apps',
'--no-zygote',
'--disable-accelerated-2d-canvas',
'--disable-web-security'
// '--disable-gpu'
// '--js-flags="--max-old-space-size=1024"'
]
if (nopechaKey) {
const nopechaPath = path.join(
__dirname,
'..',
'third-party',
'nopecha-chrome-extension'
)
puppeteerArgs.push(`--disable-extensions-except=${nopechaPath}`)
puppeteerArgs.push(`--load-extension=${nopechaPath}`)
hasNopechaExtension = true
}
if (proxyServer) {
const ipPort = proxyServer.includes('@')
? proxyServer.split('@')[1]
: proxyServer
puppeteerArgs.push(`--proxy-server=${ipPort}`)
}
const browser = await puppeteer.launch({
headless: false,
args: puppeteerArgs,
ignoreDefaultArgs: [
'--disable-extensions',
'--enable-automation',
'--disable-component-extensions-with-background-pages'
],
ignoreHTTPSErrors: true,
executablePath,
...launchOptions
})
if (process.env.PROXY_VALIDATE_IP) {
const page = (await browser.pages())[0] || (await browser.newPage())
if (minimize) {
await minimizePage(page)
}
// Send a fetch request to https://ifconfig.co using page.evaluate() and
// verify that the IP matches
let ip: string
try {
const res = await page.evaluate(() => {
return fetch('https://ifconfig.co', {
headers: {
Accept: 'application/json'
}
}).then((res) => res.json())
})
ip = res?.ip
} catch (err) {
throw new Error(`Proxy IP validation failed: ${err.toString()}`)
}
if (!ip || ip !== process.env.PROXY_VALIDATE_IP) {
throw new Error(
`Proxy IP mismatch: ${ip} !== ${process.env.PROXY_VALIDATE_IP}`
)
}
}
await initializeNopechaExtension(browser, {
minimize,
nopechaKey
})
return browser
}
export async function initializeNopechaExtension(
browser: Browser,
opts: {
minimize?: boolean
nopechaKey?: string
}
) {
const { minimize = false, nopechaKey } = opts
// TODO: this is a really hackity hack way of setting the API key...
if (hasNopechaExtension) {
const page = (await browser.pages())[0] || (await browser.newPage())
if (minimize) {
await minimizePage(page)
}
await page.goto(`https://nopecha.com/setup#${nopechaKey}`, {
waitUntil: 'networkidle0'
})
await delay(1000)
try {
// find the nopecha extension ID
const targets = browser.targets()
const extensionIds = (
await Promise.all(
targets.map(async (target) => {
// console.log(target.type(), target.url())
if (target.type() !== 'service_worker') {
return
}
// const titleL = title?.toLowerCase()
// if (titleL?.includes('nopecha'))
const url = new URL(target.url())
return url.hostname
})
)
).filter(Boolean)
const extensionId = extensionIds[0]
if (extensionId) {
const extensionUrl = `chrome-extension://${extensionId}/popup.html`
await page.goto(extensionUrl, { waitUntil: 'networkidle0' })
const editKey = await page.waitForSelector('#edit_key .clickable')
await editKey.click()
const settingsInput = await page.waitForSelector('input.settings_text')
const value = await settingsInput.evaluate((el) => el.value)
if (value !== nopechaKey) {
for (let i = 0; i <= 30; i++) {
await settingsInput.press('Backspace')
}
await settingsInput.type(nopechaKey)
await settingsInput.press('Enter')
await delay(500)
await editKey.click()
await delay(2000)
}
console.log('initialized nopecha extension with key', nopechaKey)
} else {
console.error(
"error initializing nopecha extension; couldn't determine extension ID"
)
}
} catch (err) {
console.error('error initializing nopecha extension', err)
}
}
}
/**
* Gets the default path to chrome's executable for the current platform.
*/
export const defaultChromeExecutablePath = (): string => {
// return executablePath()
if (process.env.PUPPETEER_EXECUTABLE_PATH) {
return process.env.PUPPETEER_EXECUTABLE_PATH
}
switch (os.platform()) {
case 'win32':
return 'C:\\Program Files\\Google\\Chrome\\Application\\chrome.exe'
case 'darwin':
return '/Applications/Google Chrome.app/Contents/MacOS/Google Chrome'
default: {
/**
* Since two (2) separate chrome releases exist on linux, we first do a
* check to ensure we're executing the right one.
*/
const chromeExists = fs.existsSync('/usr/bin/google-chrome')
return chromeExists
? '/usr/bin/google-chrome'
: '/usr/bin/google-chrome-stable'
}
}
}
async function checkForChatGPTAtCapacity(
page: Page,
opts: {
timeoutMs?: number
pollingIntervalMs?: number
retries?: number
} = {}
) {
const {
timeoutMs = 2 * 60 * 1000, // 2 minutes
pollingIntervalMs = 3000,
retries = 10
} = opts
// console.log('checkForChatGPTAtCapacity', page.url())
let isAtCapacity = false
let numTries = 0
do {
try {
await solveSimpleCaptchas(page)
const res = await page.$x("//div[contains(., 'ChatGPT is at capacity')]")
isAtCapacity = !!res?.length
if (isAtCapacity) {
if (++numTries >= retries) {
break
}
// try refreshing the page if chatgpt is at capacity
await page.reload({
waitUntil: 'networkidle2',
timeout: timeoutMs
})
await delay(pollingIntervalMs)
}
} catch (err) {
// ignore errors likely due to navigation
++numTries
break
}
} while (isAtCapacity)
if (isAtCapacity) {
const error = new types.ChatGPTError('ChatGPT is at capacity')
error.statusCode = 503
throw error
}
}
async function waitForConditionOrAtCapacity(
page: Page,
condition: () => Promise<any>,
opts: {
pollingIntervalMs?: number
} = {}
) {
const { pollingIntervalMs = 500 } = opts
return new Promise<void>((resolve, reject) => {
let resolved = false
async function waitForCapacityText() {
if (resolved) {
return
}
try {
await checkForChatGPTAtCapacity(page)
if (!resolved) {
setTimeout(waitForCapacityText, pollingIntervalMs)
}
} catch (err) {
if (!resolved) {
resolved = true
return reject(err)
}
}
}
condition()
.then(() => {
if (!resolved) {
resolved = true
resolve()
}
})
.catch((err) => {
if (!resolved) {
resolved = true
reject(err)
}
})
setTimeout(waitForCapacityText, pollingIntervalMs)
})
}
async function solveSimpleCaptchas(page: Page) {
try {
const verifyYouAreHuman = await page.$('text=Verify you are human')
if (verifyYouAreHuman) {
await delay(2000)
await verifyYouAreHuman.click({
delay: random.int(5, 25)
})
await delay(1000)
}
const cloudflareButton = await page.$('.hcaptcha-box')
if (cloudflareButton) {
await delay(2000)
await cloudflareButton.click({
delay: random.int(5, 25)
})
await delay(1000)
}
} catch (err) {
// ignore errors
}
}
async function waitForRecaptcha(
page: Page,
opts: {
pollingIntervalMs?: number
timeoutMs?: number
} = {}
) {
await solveSimpleCaptchas(page)
if (!hasNopechaExtension) {
return
}
const { pollingIntervalMs = 100, timeoutMs } = opts
const captcha = await page.$('textarea#g-recaptcha-response')
const startTime = Date.now()
if (captcha) {
console.log('waiting to solve recaptcha...')
do {
const captcha = await page.$('textarea#g-recaptcha-response')
if (!captcha) {
// the user may have gone past the page manually
break
}
const value = (await captcha.evaluate((el) => el.value))?.trim()
if (value?.length) {
// recaptcha has been solved!
break
}
if (timeoutMs) {
const now = Date.now()
if (now - startTime >= timeoutMs) {
throw new TimeoutError('Timed out waiting to solve Recaptcha')
}
}
await delay(pollingIntervalMs)
} while (true)
}
}

src/tokenizer.ts  100644  (new file, 8 lines added)
View File

@@ -0,0 +1,8 @@
import { get_encoding } from '@dqbd/tiktoken'
// TODO: make this configurable
const tokenizer = get_encoding('cl100k_base')
export function encode(input: string): Uint32Array {
return tokenizer.encode(input)
}
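The new src/tokenizer.ts wraps @dqbd/tiktoken's cl100k_base encoding, which is used elsewhere in this changeset to estimate prompt and completion token usage. A small sketch of how it can be used to budget a prompt (the countTokens helper is hypothetical, only `encode` comes from this file):

import { encode } from './tokenizer'

// Rough token count for fitting a prompt under maxModelTokens
function countTokens(text: string): number {
  return encode(text).length
}

console.log(countTokens('Hello, world!')) // 4 tokens with cl100k_base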

View File

@@ -1,131 +1,103 @@
export type ContentType = 'text'
import Keyv from 'keyv'
export type Role = 'user' | 'assistant'
export type Role = 'user' | 'assistant' | 'system'
/**
* https://chat.openapi.com/api/auth/session
*/
export type SessionResult = {
/**
* Authenticated user
*/
user: User
export type FetchFn = typeof fetch
/**
* ISO date of the expiration date of the access token
*/
expires: string
export type ChatGPTAPIOptions = {
apiKey: string
/**
* The access token
*/
accessToken: string
/** @defaultValue `'https://api.openai.com'` **/
apiBaseUrl?: string
/**
* If there was an error associated with this request
*/
error?: string | null
apiOrg: string
/** @defaultValue `false` **/
debug?: boolean
completionParams?: Partial<
Omit<openai.CreateChatCompletionRequest, 'messages' | 'n' | 'stream'>
>
systemMessage?: string
/** @defaultValue `4096` **/
maxModelTokens?: number
/** @defaultValue `1000` **/
maxResponseTokens?: number
messageStore?: Keyv
getMessageById?: GetMessageByIdFunction
upsertMessage?: UpsertMessageFunction
fetch?: FetchFn
}
export type User = {
/**
* ID of the user
*/
export type SendMessageOptions = {
/** The name of a user in a multi-user chat. */
name?: string
parentMessageId?: string
messageId?: string
stream?: boolean
systemMessage?: string
timeoutMs?: number
onProgress?: (partialResponse: ChatMessage) => void
abortSignal?: AbortSignal
completionParams?: Partial<
Omit<openai.CreateChatCompletionRequest, 'messages' | 'n' | 'stream'>
>
}
export type MessageActionType = 'next' | 'variant'
export type SendMessageBrowserOptions = {
conversationId?: string
parentMessageId?: string
messageId?: string
action?: MessageActionType
timeoutMs?: number
onProgress?: (partialResponse: ChatMessage) => void
abortSignal?: AbortSignal
}
export interface ChatMessage {
id: string
text: string
role: Role
name?: string
delta?: string
detail?:
| openai.CreateChatCompletionResponse
| CreateChatCompletionStreamResponse
/**
* Name of the user
*/
name: string
/**
* Email of the user
*/
email?: string
/**
* Image of the user
*/
image: string
/**
* Picture of the user
*/
picture: string
/**
* Groups the user is in
*/
groups: string[]
/**
* Features the user is in
*/
features: string[]
// relevant for both ChatGPTAPI and ChatGPTUnofficialProxyAPI
parentMessageId?: string
// only relevant for ChatGPTUnofficialProxyAPI
conversationId?: string
}
/**
* https://chat.openapi.com/backend-api/models
*/
export type ModelsResult = {
/**
* Array of models
*/
models: Model[]
export class ChatGPTError extends Error {
statusCode?: number
statusText?: string
isFinal?: boolean
accountId?: string
}
export type Model = {
/**
* Name of the model
*/
slug: string
/** Returns a chat message from a store by its ID (or null if not found). */
export type GetMessageByIdFunction = (id: string) => Promise<ChatMessage>
/**
* Max tokens of the model
*/
max_tokens: number
/** Upserts a chat message to a store. */
export type UpsertMessageFunction = (message: ChatMessage) => Promise<void>
/**
* Whether or not the model is special
*/
is_special: boolean
export interface CreateChatCompletionStreamResponse
extends openai.CreateChatCompletionDeltaResponse {
usage: CreateCompletionStreamResponseUsage
}
/**
* https://chat.openapi.com/backend-api/moderations
*/
export type ModerationsJSONBody = {
/**
* Input for the moderation decision
*/
input: string
/**
* The model to use in the decision
*/
model: AvailableModerationModels
}
export type AvailableModerationModels = 'text-moderation-playground'
/**
* https://chat.openapi.com/backend-api/moderations
*/
export type ModerationsJSONResult = {
/**
* Whether or not the input is flagged
*/
flagged: boolean
/**
* Whether or not the input is blocked
*/
blocked: boolean
/**
* The ID of the decision
*/
moderation_id: string
export interface CreateCompletionStreamResponseUsage
extends openai.CreateCompletionResponseUsage {
estimated: true
}
/**
@@ -175,6 +147,8 @@ export type Prompt = {
role: Role
}
export type ContentType = 'text'
export type PromptContent = {
/**
* The content type of the prompt
@@ -187,67 +161,6 @@ export type PromptContent = {
parts: string[]
}
/**
* https://chat.openapi.com/backend-api/conversation/message_feedback
*/
export type MessageFeedbackJSONBody = {
/**
* The ID of the conversation
*/
conversation_id: string
/**
* The message ID
*/
message_id: string
/**
* The rating
*/
rating: MessageFeedbackRating
/**
* Tags to give the rating
*/
tags?: MessageFeedbackTags[]
/**
* The text to include
*/
text?: string
}
export type MessageFeedbackTags = 'harmful' | 'false' | 'not-helpful'
export type MessageFeedbackResult = {
/**
* The message ID
*/
message_id: string
/**
* The ID of the conversation
*/
conversation_id: string
/**
* The ID of the user
*/
user_id: string
/**
* The rating
*/
rating: MessageFeedbackRating
/**
* The text the server received, including tags
*/
text?: string
}
export type MessageFeedbackRating = 'thumbsUp' | 'thumbsDown'
export type ConversationResponseEvent = {
message?: Message
conversation_id?: string
@@ -257,7 +170,7 @@ export type ConversationResponseEvent = {
export type Message = {
id: string
content: MessageContent
role: string
role: Role
user: string | null
create_time: string | null
update_time: string | null
@@ -273,38 +186,259 @@ export type MessageContent = {
}
export type MessageMetadata = any
export type MessageActionType = 'next' | 'variant'
export type SendMessageOptions = {
conversationId?: string
parentMessageId?: string
messageId?: string
action?: MessageActionType
timeoutMs?: number
onProgress?: (partialResponse: ChatResponse) => void
abortSignal?: AbortSignal
}
export type SendConversationMessageOptions = Omit<
SendMessageOptions,
'conversationId' | 'parentMessageId'
>
export class ChatGPTError extends Error {
statusCode?: number
statusText?: string
response?: Response
originalError?: Error
}
export type ChatError = {
error: { message: string; statusCode?: number; statusText?: string }
conversationId?: string
messageId?: string
}
export type ChatResponse = {
response: string
conversationId: string
messageId: string
export namespace openai {
export interface CreateChatCompletionDeltaResponse {
id: string
object: 'chat.completion.chunk'
created: number
model: string
choices: [
{
delta: {
role: Role
content?: string
}
index: number
finish_reason: string | null
}
]
}
/**
*
* @export
* @interface ChatCompletionRequestMessage
*/
export interface ChatCompletionRequestMessage {
/**
* The role of the author of this message.
* @type {string}
* @memberof ChatCompletionRequestMessage
*/
role: ChatCompletionRequestMessageRoleEnum
/**
* The contents of the message
* @type {string}
* @memberof ChatCompletionRequestMessage
*/
content: string
/**
* The name of the user in a multi-user chat
* @type {string}
* @memberof ChatCompletionRequestMessage
*/
name?: string
}
export declare const ChatCompletionRequestMessageRoleEnum: {
readonly System: 'system'
readonly User: 'user'
readonly Assistant: 'assistant'
}
export declare type ChatCompletionRequestMessageRoleEnum =
(typeof ChatCompletionRequestMessageRoleEnum)[keyof typeof ChatCompletionRequestMessageRoleEnum]
/**
*
* @export
* @interface ChatCompletionResponseMessage
*/
export interface ChatCompletionResponseMessage {
/**
* The role of the author of this message.
* @type {string}
* @memberof ChatCompletionResponseMessage
*/
role: ChatCompletionResponseMessageRoleEnum
/**
* The contents of the message
* @type {string}
* @memberof ChatCompletionResponseMessage
*/
content: string
}
export declare const ChatCompletionResponseMessageRoleEnum: {
readonly System: 'system'
readonly User: 'user'
readonly Assistant: 'assistant'
}
export declare type ChatCompletionResponseMessageRoleEnum =
(typeof ChatCompletionResponseMessageRoleEnum)[keyof typeof ChatCompletionResponseMessageRoleEnum]
/**
*
* @export
* @interface CreateChatCompletionRequest
*/
export interface CreateChatCompletionRequest {
/**
* ID of the model to use. Currently, only `gpt-3.5-turbo` and `gpt-3.5-turbo-0301` are supported.
* @type {string}
* @memberof CreateChatCompletionRequest
*/
model: string
/**
* The messages to generate chat completions for, in the [chat format](/docs/guides/chat/introduction).
* @type {Array<ChatCompletionRequestMessage>}
* @memberof CreateChatCompletionRequest
*/
messages: Array<ChatCompletionRequestMessage>
/**
* What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. We generally recommend altering this or `top_p` but not both.
* @type {number}
* @memberof CreateChatCompletionRequest
*/
temperature?: number | null
/**
* An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. We generally recommend altering this or `temperature` but not both.
* @type {number}
* @memberof CreateChatCompletionRequest
*/
top_p?: number | null
/**
* How many chat completion choices to generate for each input message.
* @type {number}
* @memberof CreateChatCompletionRequest
*/
n?: number | null
/**
* If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as data-only [server-sent events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events#Event_stream_format) as they become available, with the stream terminated by a `data: [DONE]` message.
* @type {boolean}
* @memberof CreateChatCompletionRequest
*/
stream?: boolean | null
/**
*
* @type {CreateChatCompletionRequestStop}
* @memberof CreateChatCompletionRequest
*/
stop?: CreateChatCompletionRequestStop
/**
* The maximum number of tokens allowed for the generated answer. By default, the number of tokens the model can return will be (4096 - prompt tokens).
* @type {number}
* @memberof CreateChatCompletionRequest
*/
max_tokens?: number
/**
* Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model\'s likelihood to talk about new topics. [See more information about frequency and presence penalties.](/docs/api-reference/parameter-details)
* @type {number}
* @memberof CreateChatCompletionRequest
*/
presence_penalty?: number | null
/**
* Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model\'s likelihood to repeat the same line verbatim. [See more information about frequency and presence penalties.](/docs/api-reference/parameter-details)
* @type {number}
* @memberof CreateChatCompletionRequest
*/
frequency_penalty?: number | null
/**
* Modify the likelihood of specified tokens appearing in the completion. Accepts a json object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token.
* @type {object}
* @memberof CreateChatCompletionRequest
*/
logit_bias?: object | null
/**
* A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. [Learn more](/docs/guides/safety-best-practices/end-user-ids).
* @type {string}
* @memberof CreateChatCompletionRequest
*/
user?: string
}
/**
* @type CreateChatCompletionRequestStop
* Up to 4 sequences where the API will stop generating further tokens.
* @export
*/
export declare type CreateChatCompletionRequestStop = Array<string> | string
/**
*
* @export
* @interface CreateChatCompletionResponse
*/
export interface CreateChatCompletionResponse {
/**
*
* @type {string}
* @memberof CreateChatCompletionResponse
*/
id: string
/**
*
* @type {string}
* @memberof CreateChatCompletionResponse
*/
object: string
/**
*
* @type {number}
* @memberof CreateChatCompletionResponse
*/
created: number
/**
*
* @type {string}
* @memberof CreateChatCompletionResponse
*/
model: string
/**
*
* @type {Array<CreateChatCompletionResponseChoicesInner>}
* @memberof CreateChatCompletionResponse
*/
choices: Array<CreateChatCompletionResponseChoicesInner>
/**
*
* @type {CreateCompletionResponseUsage}
* @memberof CreateChatCompletionResponse
*/
usage?: CreateCompletionResponseUsage
}
/**
*
* @export
* @interface CreateChatCompletionResponseChoicesInner
*/
export interface CreateChatCompletionResponseChoicesInner {
/**
*
* @type {number}
* @memberof CreateChatCompletionResponseChoicesInner
*/
index?: number
/**
*
* @type {ChatCompletionResponseMessage}
* @memberof CreateChatCompletionResponseChoicesInner
*/
message?: ChatCompletionResponseMessage
/**
*
* @type {string}
* @memberof CreateChatCompletionResponseChoicesInner
*/
finish_reason?: string
}
/**
*
* @export
* @interface CreateCompletionResponseUsage
*/
export interface CreateCompletionResponseUsage {
/**
*
* @type {number}
* @memberof CreateCompletionResponseUsage
*/
prompt_tokens: number
/**
*
* @type {number}
* @memberof CreateCompletionResponseUsage
*/
completion_tokens: number
/**
*
* @type {number}
* @memberof CreateCompletionResponseUsage
*/
total_tokens: number
}
}
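Taken together, the reshaped types above describe the official-API flow: configure the client with ChatGPTAPIOptions, stream partial ChatMessage deltas through onProgress, and thread follow-ups with parentMessageId (streamed usage figures are marked `estimated: true`, since they are computed locally from the tokenizer). A hedged end-to-end sketch; it assumes the ChatGPTAPI class accepts exactly these options and that sendMessage resolves to a ChatMessage:

import { ChatGPTAPI } from 'chatgpt'

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY ?? '',
  completionParams: { model: 'gpt-3.5-turbo', temperature: 0.7 },
  systemMessage: 'You are a terse assistant.'
})

const first = await api.sendMessage('What are server-sent events?', {
  // partial.delta holds only the newly streamed tokens
  onProgress: (partial) => process.stdout.write(partial.delta ?? '')
})

// Continue the same conversation by pointing at the previous message id
const followUp = await api.sendMessage('Give a one-line example.', {
  parentMessageId: first.id
})
console.log(followUp.text)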

View File

@@ -1,521 +1,6 @@
import type * as PTimeoutTypes from 'p-timeout'
import type {
EventSourceParseCallback,
EventSourceParser
} from 'eventsource-parser'
import type { Page } from 'puppeteer'
import { remark } from 'remark'
import stripMarkdown from 'strip-markdown'
const uuidv4Re =
/^[0-9a-f]{8}-[0-9a-f]{4}-[1-5][0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i
import * as types from './types'
export function markdownToText(markdown?: string): string {
return remark()
.use(stripMarkdown)
.processSync(markdown ?? '')
.toString()
}
export async function minimizePage(page: Page) {
const session = await page.target().createCDPSession()
const goods = await session.send('Browser.getWindowForTarget')
const { windowId } = goods
await session.send('Browser.setWindowBounds', {
windowId,
bounds: { windowState: 'minimized' }
})
}
export async function maximizePage(page: Page) {
const session = await page.target().createCDPSession()
const goods = await session.send('Browser.getWindowForTarget')
const { windowId } = goods
await session.send('Browser.setWindowBounds', {
windowId,
bounds: { windowState: 'normal' }
})
}
export function isRelevantRequest(url: string): boolean {
let pathname: string
try {
const parsedUrl = new URL(url)
pathname = parsedUrl.pathname
url = parsedUrl.toString()
} catch (_) {
return false
}
if (!url.startsWith('https://chat.openai.com')) {
return false
}
if (
!pathname.startsWith('/backend-api/') &&
!pathname.startsWith('/api/auth/session')
) {
return false
}
if (pathname.endsWith('backend-api/moderations')) {
return false
}
return true
}
/**
* This function is injected into the ChatGPT webapp page using puppeteer. It
* has to be fully self-contained, so we copied a few third-party sources and
* included them in here.
*/
export async function browserPostEventStream(
url: string,
accessToken: string,
body: types.ConversationJSONBody,
timeoutMs?: number
): Promise<types.ChatError | types.ChatResponse> {
// Workaround for https://github.com/esbuild-kit/tsx/issues/113
globalThis.__name = () => undefined
class TimeoutError extends Error {
readonly name: 'TimeoutError'
constructor(message) {
super(message)
this.name = 'TimeoutError'
}
}
/**
An error to be thrown when the request is aborted by AbortController.
DOMException is thrown instead of this Error when DOMException is available.
*/
class AbortError extends Error {
constructor(message) {
super()
this.name = 'AbortError'
this.message = message
}
}
const BOM = [239, 187, 191]
let conversationId: string = body?.conversation_id
let messageId: string = body?.messages?.[0]?.id
let response = ''
try {
console.log('browserPostEventStream', url, accessToken, body)
let abortController: AbortController = null
if (timeoutMs) {
abortController = new AbortController()
}
const res = await fetch(url, {
method: 'POST',
body: JSON.stringify(body),
signal: abortController?.signal,
headers: {
accept: 'text/event-stream',
'x-openai-assistant-app-id': '',
authorization: `Bearer ${accessToken}`,
'content-type': 'application/json'
}
})
console.log('browserPostEventStream response', res)
if (!res.ok) {
return {
error: {
message: `ChatGPTAPI error ${res.status || res.statusText}`,
statusCode: res.status,
statusText: res.statusText
},
conversationId,
messageId
}
}
const responseP = new Promise<types.ChatResponse>(
async (resolve, reject) => {
function onMessage(data: string) {
if (data === '[DONE]') {
return resolve({
response,
conversationId,
messageId
})
}
try {
const convoResponseEvent: types.ConversationResponseEvent =
JSON.parse(data)
if (convoResponseEvent.conversation_id) {
conversationId = convoResponseEvent.conversation_id
}
if (convoResponseEvent.message?.id) {
messageId = convoResponseEvent.message.id
}
const partialResponse =
convoResponseEvent.message?.content?.parts?.[0]
if (partialResponse) {
response = partialResponse
}
} catch (err) {
console.warn('fetchSSE onMessage unexpected error', err)
reject(err)
}
}
const parser = createParser((event) => {
if (event.type === 'event') {
onMessage(event.data)
}
})
for await (const chunk of streamAsyncIterable(res.body)) {
const str = new TextDecoder().decode(chunk)
parser.feed(str)
}
}
)
if (timeoutMs) {
if (abortController) {
// This will be called when a timeout occurs in order for us to forcibly
// ensure that the underlying HTTP request is aborted.
;(responseP as any).cancel = () => {
abortController.abort()
}
}
return await pTimeout(responseP, {
milliseconds: timeoutMs,
message: 'ChatGPT timed out waiting for response'
})
} else {
return await responseP
}
} catch (err) {
const errMessageL = err.toString().toLowerCase()
if (
response &&
(errMessageL === 'error: typeerror: terminated' ||
errMessageL === 'typeerror: terminated')
) {
// OpenAI sometimes forcefully terminates the socket from their end before
// the HTTP request has resolved cleanly. In my testing, these cases tend to
// happen when OpenAI has already sent the last `response`, so we can ignore
// the `fetch` error in this case.
return {
response,
conversationId,
messageId
}
}
return {
error: {
message: err.toString(),
statusCode: err.statusCode || err.status || err.response?.statusCode,
statusText: err.statusText || err.response?.statusText
},
conversationId,
messageId
}
}
async function* streamAsyncIterable<T>(stream: ReadableStream<T>) {
const reader = stream.getReader()
try {
while (true) {
const { done, value } = await reader.read()
if (done) {
return
}
yield value
}
} finally {
reader.releaseLock()
}
}
// @see https://github.com/rexxars/eventsource-parser
function createParser(onParse: EventSourceParseCallback): EventSourceParser {
// Processing state
let isFirstChunk: boolean
let buffer: string
let startingPosition: number
let startingFieldLength: number
// Event state
let eventId: string | undefined
let eventName: string | undefined
let data: string
reset()
return { feed, reset }
function reset(): void {
isFirstChunk = true
buffer = ''
startingPosition = 0
startingFieldLength = -1
eventId = undefined
eventName = undefined
data = ''
}
function feed(chunk: string): void {
buffer = buffer ? buffer + chunk : chunk
// Strip any UTF8 byte order mark (BOM) at the start of the stream.
// Note that we do not strip any non - UTF8 BOM, as eventsource streams are
// always decoded as UTF8 as per the specification.
if (isFirstChunk && hasBom(buffer)) {
buffer = buffer.slice(BOM.length)
}
isFirstChunk = false
// Set up chunk-specific processing state
const length = buffer.length
let position = 0
let discardTrailingNewline = false
// Read the current buffer byte by byte
while (position < length) {
// EventSource allows for carriage return + line feed, which means we
// need to ignore a linefeed character if the previous character was a
// carriage return
// @todo refactor to reduce nesting, consider checking previous byte?
// @todo but consider multiple chunks etc
if (discardTrailingNewline) {
if (buffer[position] === '\n') {
++position
}
discardTrailingNewline = false
}
let lineLength = -1
let fieldLength = startingFieldLength
let character: string
for (
let index = startingPosition;
lineLength < 0 && index < length;
++index
) {
character = buffer[index]
if (character === ':' && fieldLength < 0) {
fieldLength = index - position
} else if (character === '\r') {
discardTrailingNewline = true
lineLength = index - position
} else if (character === '\n') {
lineLength = index - position
}
}
if (lineLength < 0) {
startingPosition = length - position
startingFieldLength = fieldLength
break
} else {
startingPosition = 0
startingFieldLength = -1
}
parseEventStreamLine(buffer, position, fieldLength, lineLength)
position += lineLength + 1
}
if (position === length) {
// If we consumed the entire buffer to read the event, reset the buffer
buffer = ''
} else if (position > 0) {
// If there are bytes left to process, set the buffer to the unprocessed
// portion of the buffer only
buffer = buffer.slice(position)
}
}
function parseEventStreamLine(
lineBuffer: string,
index: number,
fieldLength: number,
lineLength: number
) {
if (lineLength === 0) {
// We reached the last line of this event
if (data.length > 0) {
onParse({
type: 'event',
id: eventId,
event: eventName || undefined,
data: data.slice(0, -1) // remove trailing newline
})
data = ''
eventId = undefined
}
eventName = undefined
return
}
const noValue = fieldLength < 0
const field = lineBuffer.slice(
index,
index + (noValue ? lineLength : fieldLength)
)
let step = 0
if (noValue) {
step = lineLength
} else if (lineBuffer[index + fieldLength + 1] === ' ') {
step = fieldLength + 2
} else {
step = fieldLength + 1
}
const position = index + step
const valueLength = lineLength - step
const value = lineBuffer
.slice(position, position + valueLength)
.toString()
if (field === 'data') {
data += value ? `${value}\n` : '\n'
} else if (field === 'event') {
eventName = value
} else if (field === 'id' && !value.includes('\u0000')) {
eventId = value
} else if (field === 'retry') {
const retry = parseInt(value, 10)
if (!Number.isNaN(retry)) {
onParse({ type: 'reconnect-interval', value: retry })
}
}
}
}
function hasBom(buffer: string) {
return BOM.every(
(charCode: number, index: number) => buffer.charCodeAt(index) === charCode
)
}
/**
TODO: Remove AbortError and just throw DOMException when targeting Node 18.
*/
function getDOMException(errorMessage) {
return globalThis.DOMException === undefined
? new AbortError(errorMessage)
: new DOMException(errorMessage)
}
/**
TODO: Remove below function and just 'reject(signal.reason)' when targeting Node 18.
*/
function getAbortedReason(signal) {
const reason =
signal.reason === undefined
? getDOMException('This operation was aborted.')
: signal.reason
return reason instanceof Error ? reason : getDOMException(reason)
}
// @see https://github.com/sindresorhus/p-timeout
function pTimeout<ValueType, ReturnType = ValueType>(
promise: PromiseLike<ValueType>,
options: PTimeoutTypes.Options<ReturnType>
): PTimeoutTypes.ClearablePromise<ValueType | ReturnType> {
const {
milliseconds,
fallback,
message,
customTimers = { setTimeout, clearTimeout }
} = options
let timer: number
const cancelablePromise = new Promise((resolve, reject) => {
if (typeof milliseconds !== 'number' || Math.sign(milliseconds) !== 1) {
throw new TypeError(
`Expected \`milliseconds\` to be a positive number, got \`${milliseconds}\``
)
}
if (milliseconds === Number.POSITIVE_INFINITY) {
resolve(promise)
return
}
if (options.signal) {
const { signal } = options
if (signal.aborted) {
reject(getAbortedReason(signal))
}
signal.addEventListener('abort', () => {
reject(getAbortedReason(signal))
})
}
timer = customTimers.setTimeout.call(
undefined,
() => {
if (fallback) {
try {
resolve(fallback())
} catch (error) {
reject(error)
}
return
}
const errorMessage =
typeof message === 'string'
? message
: `Promise timed out after ${milliseconds} milliseconds`
const timeoutError =
message instanceof Error ? message : new TimeoutError(errorMessage)
if (typeof (promise as any).cancel === 'function') {
;(promise as any).cancel()
}
reject(timeoutError)
},
milliseconds
)
;(async () => {
try {
resolve(await promise)
} catch (error) {
reject(error)
} finally {
customTimers.clearTimeout.call(undefined, timer)
}
})()
})
;(cancelablePromise as any).clear = () => {
customTimers.clearTimeout.call(undefined, timer)
timer = undefined
}
return cancelablePromise as any
}
export function isValidUUIDv4(str: string): boolean {
return str && uuidv4Re.test(str)
}

View File

@@ -1 +0,0 @@
const VERSION="chrome",browser=globalThis.chrome;function reconnect_scripts(){browser.runtime.onInstalled.addListener(async()=>{for(const e of browser.runtime.getManifest().content_scripts)for(const r of await browser.tabs.query({url:e.matches}))browser.scripting.executeScript({target:{tabId:r.id},files:e.js})})}function register_language(){browser.declarativeNetRequest.updateDynamicRules({addRules:[{id:1,priority:1,action:{type:"redirect",redirect:{transform:{queryTransform:{addOrReplaceParams:[{key:"hl",value:"en-US"}]}}}},condition:{regexFilter:"^(http|https)://[^\\.]*\\.(google\\.com|recaptcha\\.net)/recaptcha",resourceTypes:["sub_frame"]}},{id:2,priority:1,action:{type:"redirect",redirect:{transform:{queryTransform:{addOrReplaceParams:[{key:"lang",value:"en"}]}}}},condition:{regexFilter:"^(http|https)://[^\\.]*\\.(funcaptcha\\.(co|com)|arkoselabs\\.(com|cn)|arkose\\.com\\.cn)",resourceTypes:["sub_frame"]}}],removeRuleIds:[1,2]})}export{VERSION,browser,reconnect_scripts,register_language};

View File

@@ -1 +0,0 @@
(async()=>{let i=null;function a(a=500){return new Promise(t=>{let c=!1;const n=setInterval(async()=>{if(!c){c=!0;var a=await BG.exec("Settings.get");if(a.enabled&&a.awscaptcha_auto_solve){a=document.querySelector('input[placeholder="Answer"]');if(a&&""===a.value){var e=function(){try{return document.querySelector("audio").src.replace("data:audio/aac;base64,","")}catch(a){}return null}();if(e&&i!==e)return i=e,clearInterval(n),c=!1,t({input:a,audio_data:e})}c=!1}}},a)})}for(;;){await Time.sleep(1e3);var e=await BG.exec("Settings.get");if(e&&e.enabled){var t,c,n,o,l=await Location.hostname();if(!e.disabled_hosts.includes(l))if(e.awscaptcha_auto_open&&null!==document.querySelector("#captcha-container > #root #amzn-captcha-verify-button")){l=void 0;try{var l=document.querySelector("#captcha-container > #root #amzn-captcha-verify-button");l&&l.click()}catch(a){}await 0}else if(e.hcaptcha_auto_solve&&null!==document.querySelector('#captcha-container > #root #amzn-btn-audio-internal > img[title="Audio problem"]')){l=void 0;try{l=document.querySelector("#captcha-container > #root #amzn-btn-audio-internal");l&&l.click()}catch(a){}await 0}else e.hcaptcha_auto_solve&&null!==document.querySelector('#captcha-container > #root #amzn-btn-audio-internal > img[title="Visual problem"]')&&(n=c=t=o=e=l=void 0,{input:l,audio_data:e}=await a(),await!(null!==l&&null!==e&&(o=await BG.exec("Settings.get")).enabled&&o.awscaptcha_auto_solve&&(t=Time.time(),{job_id:c,data:e}=await NopeCHA.post({captcha_type:IS_DEVELOPMENT?"awscaptcha_dev":"awscaptcha",audio_data:[e],key:o.key}),!e||0===e.length||(n=(n=parseInt(o.awscaptcha_solve_delay_time))||1e3,0<(o=o.awscaptcha_solve_delay?n-(Time.time()-t):0)&&await Time.sleep(o),0===e[0].length)?(document.querySelector("#amzn-btn-refresh-internal")?.click(),await Time.sleep(200),i=null):(l.value=e[0],await Time.sleep(200),document.querySelector("#amzn-btn-verify-internal")?.click()))))}}})();

File diff suppressed because one or more lines are too long

View File

@@ -1 +0,0 @@
function sleep(t){return new Promise(e=>setTimeout(t))}class BG{static exec(){return new Promise(t=>{try{chrome.runtime.sendMessage([...arguments],t)}catch(e){sleep(1e3).then(()=>{t(null)})}})}}class Net{static async fetch(e,t){return BG.exec("Net.fetch",{url:e,options:t})}}class Script{static inject_file(a){return new Promise(e=>{var t=document.createElement("script");t.src=chrome.runtime.getURL(a),t.onload=e,(document.head||document.documentElement).appendChild(t)})}}class Location{static parse_hostname(e){return e.replace(/^(.*:)\/\/([A-Za-z0-9\-\.]+)(:[0-9]+)?(.*)$/,"$2")}static async hostname(){var e=await BG.exec("Tab.info"),e=e.url||"Unknown Host";return Location.parse_hostname(e)}}class Image{static encode(t){return new Promise(a=>{if(null===t)return a(null);const e=new XMLHttpRequest;e.onload=()=>{const t=new FileReader;t.onloadend=()=>{let e=t.result;if(e.startsWith("data:text/html;base64,"))return a(null);e=e.replace("data:image/jpeg;base64,",""),a(e)},t.readAsDataURL(e.response)},e.onerror=()=>{a(null)},e.onreadystatechange=()=>{4==this.readyState&&200!=this.status&&a(null)},e.open("GET",t),e.responseType="blob",e.send()})}}class NopeCHA{static INFERENCE_URL="https://dev-api.nopecha.com";static MAX_WAIT_POST=60;static MAX_WAIT_GET=60;static ERRORS={UNKNOWN:9,INVALID_REQUEST:10,RATE_LIIMTED:11,BANNED_USER:12,NO_JOB:13,INCOMPLETE_JOB:14,INVALID_KEY:15,NO_CREDIT:16,UPDATE_REQUIRED:17};static async post({captcha_type:e,task:t,image_urls:a,image_data:r,grid:n,audio_data:o,key:i}){for(var s=Date.now(),c=await BG.exec("Tab.info");!(Date.now()-s>1e3*NopeCHA.MAX_WAIT_POST);){var l={type:e,task:t,key:i,v:chrome.runtime.getManifest().version,url:c?c.url:window.location.href};a&&(l.image_urls=a),r&&(l.image_data=r),n&&(l.grid=n),o&&(l.audio_data=o);try{var d={"Content-Type":"application/json"},u=(i&&"undefined"!==i&&(d.Authorization="Bearer "+i),await Net.fetch(BASE_API,{method:"POST",headers:d,body:JSON.stringify(l)})),p=JSON.parse(u);if("error"in p){if(p.error===NopeCHA.ERRORS.RATE_LIMITED){await Time.sleep(2e3);continue}if(p.error===NopeCHA.ERRORS.INVALID_KEY)break;if(p.error===NopeCHA.ERRORS.NO_CREDIT)break;break}var _=p.data;return await NopeCHA.get({job_id:_,key:i})}catch(e){}}return{job_id:null,data:null}}static async get({job_id:e,key:t}){for(var a=Date.now();!(Date.now()-a>1e3*NopeCHA.MAX_WAIT_GET);){await Time.sleep(1e3);var r={},r=(t&&"undefined"!==t&&(r.Authorization="Bearer "+t),await Net.fetch(BASE_API+`?id=${e}&key=`+t,{headers:r}));try{var n=JSON.parse(r);if(!("error"in n))return{job_id:e,data:n.data,metadata:n.metadata};if(n.error!==NopeCHA.ERRORS.INCOMPLETE_JOB)return{job_id:e,data:null,metadata:null}}catch(e){}}return{job_id:e,data:null,metadata:null}}}

Binary file not shown.

Binary file not shown.

Binary file not shown.

View File

@@ -1 +0,0 @@
(async()=>{function l(e,t=!1){if(t)for(const c of e){var a=document.querySelectorAll(c);if(6===a.length)return a}else for(const i of e){var n=document.querySelector(i);if(n)return n}return null}function r(){return null!==l(['button[aria-describedby="descriptionVerify"]','button[data-theme="home.verifyButton"]',"#wrong_children_button","#wrongTimeout_children_button"])}function u(){try{var e=l(['button[aria-describedby="descriptionVerify"]','button[data-theme="home.verifyButton"]']),t=(e&&(window.parent.postMessage({nopecha:!0,action:"clear"},"*"),e.click()),document.querySelector("#wrong_children_button")),a=(t&&(window.parent.postMessage({nopecha:!0,action:"clear"},"*"),t.click()),document.querySelector("#wrongTimeout_children_button"));a&&(window.parent.postMessage({nopecha:!0,action:"clear"},"*"),a.click())}catch(e){}}function s(){return l(["#game_children_text > h2",".challenge-instructions-container > h2"])?.innerText?.trim()}function h(){let e=l(["img#game_challengeItem_image"]);var t;return e?e.src?.split(";base64,")[1]:(t=(e=l([".challenge-container button"]))?.style["background-image"]?.trim()?.match(/(?!^)".*?"/g))&&0!==t.length?t[0].replaceAll('"',""):null}let d=null;async function e(){e=500;var e,{task:t,cells:a,image_data:n}=await new Promise(n=>{let c=!1;const i=setInterval(async()=>{if(!c){c=!0;var e=await BG.exec("Settings.get");if(e&&e.enabled&&e.funcaptcha_auto_solve){e.funcaptcha_auto_open&&r()&&await u();e=s();if(e){var t=l(["#game_children_challenge ul > li > a",".challenge-container button"],!0);if(6===t.length){var a=h();if(a&&d!==a)return d=a,clearInterval(i),c=!1,n({task:e,cells:t,image_data:a})}}c=!1}}},e)});if(null!==t&&null!==a&&null!==n){var c=await BG.exec("Settings.get");if(c&&c.enabled&&c.funcaptcha_auto_solve){var i=Time.time(),o=(await NopeCHA.post({captcha_type:IS_DEVELOPMENT?"funcaptcha_dev":"funcaptcha",task:t,image_data:[n],key:c.key}))["data"];if(o){t=parseInt(c.funcaptcha_solve_delay_time)||1e3,n=c.funcaptcha_solve_delay?t-(Time.time()-i):0;0<n&&await Time.sleep(n);for(let e=0;e<o.length;e++)!1!==o[e]&&a[e].click()}d=null}}}if(setInterval(()=>{document.dispatchEvent(new Event("mousemove"))},50),window.location.pathname.startsWith("/fc/assets/tile-game-ui/")||window.location.pathname.startsWith("/fc/assets/ec-game-core/"))for(;;){await Time.sleep(1e3);var t,a=await BG.exec("Settings.get");a&&a.enabled&&(t=await Location.hostname(),a.disabled_hosts.includes(t)||(a.funcaptcha_auto_open&&r()?await u():a.funcaptcha_auto_solve&&null!==s()&&null!==h()&&await e()))}})();

View File

@@ -1,63 +0,0 @@
(async()=>{const u={linkedin:["3117BF26-4762-4F5A-8ED9-A85E69209A46",!1],rockstar:["A5A70501-FCDE-4065-AF18-D9FAF06EF479",!1],github:["20782B4C-05D0-45D7-97A0-41641055B6F6",!1],paypal:["9409E63B-D2A5-9CBD-DBC0-5095707D0090",!1],blizzard:["E8A75615-1CBA-5DFF-8032-D16BCF234E10",!1],twitch:["E5554D43-23CC-1982-971D-6A2262A2CA24",!1],demo1:["804380F4-6844-FFA1-ED4E-5877CA1F1EA4",!1],demo2:["D39B0EE3-2973-4147-98EF-C92F93451E2D",!1],"ea signup":["73BEC076-3E53-30F5-B1EB-84F494D43DBA",!1],"ea signin":["0F5FE186-B3CA-4EDB-A39B-9B9A3397D01D",!1],myprepaidcenter:["0F941BF0-7303-D94B-B76A-EAA2E2048124",!1],twitter:["2CB16598-CB82-4CF7-B332-5990DB66F3AB",!0],discoveryplus:["FE296399-FDEA-2EA2-8CD5-50F6E3157ECA",!1],minecraft:["D39B0EE3-2973-4147-98EF-C92F93451E2D",!1],imvu:["0C2B415C-D772-47D4-A183-34934F786C7E",!1],adobe:["430FF2C3-1AB1-40B7-8BE7-44FC683FE02C",!1]},h={outlook:["https://iframe.arkoselabs.com/B7D8911C-5CC8-A9A3-35B0-554ACEE604DA/index.html?mkt=en",!1],"outlook auth":["https://iframe-auth.arkoselabs.com/B7D8911C-5CC8-A9A3-35B0-554ACEE604DA/index.html?mkt=en",!1]};let E=1;function w(){g("linkedin",0,1),g("rockstar",0,1),g("demo1",0,1),g("blizzard",0,1),g("twitch",0,1),g("paypal",0,1),A("outlook auth",0,1),g("github",0,1),g("demo2",0,1),A("outlook",0,1),g("ea signup",0,1),g("ea signin",0,1),g("twitter",0,1),g("minecraft",0,1),g("imvu",0,1),g("adobe",0,1)}function g(t,o,n){n=n||E;for(let e=0;e<n;e++)!async function(e,t){var o=u[e][0],n="https://api.funcaptcha.com/fc/gt2/public_key/"+o,n=await Net.fetch(n,{headers:{accept:"*/*","accept-language":"en-US,en;q=0.9","cache-control":"no-cache","content-type":"application/x-www-form-urlencoded; charset=UTF-8",pragma:"no-cache","sec-ch-ua":'"Google Chrome";v="105", "Not)A;Brand";v="8", "Chromium";v="105"',"sec-ch-ua-mobile":"?0","sec-ch-ua-platform":'"Linux"',"sec-fetch-dest":"empty","sec-fetch-mode":"cors","sec-fetch-site":"cross-site"},referrer:"",referrerPolicy:"strict-origin-when-cross-origin",body:`bda=&public_key=${o}&site=${encodeURIComponent("")}&language=en&userbrowser=Mozilla%2F5.0%20(X11%3B%20Linux%20x86_64)%20AppleWebKit%2F537.36%20(KHTML%2C%20like%20Gecko)%20Chrome%2F105.0.0.0%20Safari%2F537.36&rnd=`+Math.random(),method:"POST",mode:"cors",credentials:"omit"}),o=JSON.parse(n),r={};for(const i of o.token.split("|")){var a=i.split("=");let e=a[0],t=a[1];a[1]||(e="token",t=a[0]),e.endsWith("url")&&(t=decodeURIComponent(t)),r[e]=t}n=new URLSearchParams(r).toString(),o="https://api.funcaptcha.com/fc/gc/?"+n;c(e,t,o,u[e][1])}(t,o)}function A(t,o,n){n=n||E;for(let e=0;e<n;e++)c(t,o,h[t][0],h[t][1])}function c(e,t,o,n=!1){var r=document.createElement("div"),a=(r.classList.add("iframe_wrap"),document.createElement("iframe"));n&&a.classList.add("small"),r.append(a),a.frameborder=0,a.scrolling="no",a.src=o;let i=document.querySelector("#iframe_row_"+t);i||((i=document.createElement("div")).classList.add("iframe_row"),i.id="iframe_row_"+t,document.body.append(i));n=document.createElement("div"),n.classList.add("name"),n.innerHTML=e,a=document.createElement("div");a.append(n),a.append(r),i.append(a)}!function e(){document.body.innerHTML="";const t=[`body, html {
background-color: #212121;
}`,`.input_row {
display: flex;
flex-direction: row;
flex-wrap: wrap;
justify-content: center;
}`,`.input_row > * {
height: 20px;
line-height: 20px;
padding: 0;
border: 0;
font-size: 12px;
}`,`.input_row > input[type="button"] {
width: 100px;
cursor: pointer;
transition: 200ms all;
}`,`.input_row > input[type="button"]:hover {
opacity: 0.8;
}`,`#nframes_label {
background-color: #fff;
color: #222;
width: 70px;
text-align: center;
}`,`#nframes, #nframes:active {
width: 30px;
border: none;
outline: none;
}`,`.name {
color: #fff;
}`,`.iframe_row {
display: flex;
flex-direction: row;
flex-wrap: wrap;
justify-content: center;
}`,`.iframe_wrap {
background-color: #eee;
width: 275px;
height: 275px;
padding: 0;
overflow: hidden;
}`,`iframe {
border: none !important;
width: 400px !important;
height: 400px !important;
-ms-zoom: 0.75 !important;
-moz-transform: scale(0.75) !important;
-moz-transform-origin: 0 0 !important;
-o-transform: scale(0.75) !important;
-o-transform-origin: 0 0 !important;
-webkit-transform: scale(0.75) !important;
-webkit-transform-origin: 0 0 !important;
}`,`iframe.small {
width: 550px !important;
height: 550px !important;
-ms-zoom: 0.5 !important;
-moz-transform: scale(0.5) !important;
-moz-transform-origin: 0 0 !important;
-o-transform: scale(0.5) !important;
-o-transform-origin: 0 0 !important;
-webkit-transform: scale(0.5) !important;
-webkit-transform-origin: 0 0 !important;
}`];const o=document.body.appendChild(document.createElement("style")).sheet;for(const n in t)o.insertRule(t[n],n);let n=0;let r=1;const a={};a[0]=document.createElement("div");a[0].classList.add("input_row");document.body.append(a[0]);const i=document.createElement("div");i.id="nframes_label";i.innerText="# iframes";a[0].append(i);const c=document.createElement("input");c.id="nframes";c.placeholder="Number of iframes";c.value=E;c.addEventListener("input",()=>{E=parseInt(c.value)});a[0].append(c);const s={reset:{row:0,fn:e,args:[]},all:{row:0,fn:w,args:[]}};for(const m in u)n++%9==0&&r++,s[m]={row:r,fn:g,args:[m,0]};for(const d in h)n++%9==0&&r++,s[d]={row:r,fn:A,args:[d,0]};for(const[p,l]of Object.entries(s)){const r=l.row,f=(l.row in a||(a[l.row]=document.createElement("div"),a[l.row].classList.add("input_row"),document.body.append(a[l.row])),document.createElement("input"));f.type="button",f.value=p,f.addEventListener("click",()=>{e(),l.fn(...l.args)}),a[l.row].append(f)}}(),A("outlook",0,E)})();

View File

@@ -1 +0,0 @@
(async()=>{window.addEventListener("load",()=>{var t=document.body.appendChild(document.createElement("style")).sheet;t.insertRule("* {transition-duration: 0s !important}",0),t.insertRule("li > a::after {border: 8px solid rgba(0, 255, 0, 0.6) !important}",1),t.insertRule("#interstitial {backdrop-filter: none !important}",2),t.insertRule("#interstitial {background-color: transparent !important}",3),t.insertRule("#interstitial_wrapper {background-color: transparent !important}",4)})})();

View File

@@ -1 +0,0 @@
(async()=>{var e=IS_DEVELOPMENT;const o="lazy";window.nopecha=[];var a={};async function t(e){var a=(document.querySelector("#game_children_text > h2")||document.querySelector("#game-header"))?.innerText?.trim(),t=(document.querySelector("img#game_challengeItem_image")||document.querySelector("#challenge-image"))?.src?.split(";base64,")[1];a&&t&&(a={task:a,image:t,index:e,url:(await BG.exec("Tab.info"))?.url},o.startsWith("l")&&window.parent.postMessage({nopecha:!0,action:"append",data:a},"*"),o.startsWith("e"))&&await Net.fetch("https://api.nopecha.com/upload",{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify(a)})}var n=window.addEventListener?"addEventListener":"attachEvent";for(window[n]("attachEvent"==n?"onmessage":"message",async e=>{e=e[e.message?"message":"data"];e&&!0===e.nopecha&&("append"===e.action?window.nopecha.push(e.data):"clear"===e.action?window.nopecha=[]:"reload"===e.action&&(window.parent.postMessage({nopecha:!0,action:"reload"},"*"),window.location.reload(!0)))},!1);;){await Time.sleep(1e3);try{if(document.querySelector("body.victory")){var i=[];for(const s of window.nopecha){var c=Net.fetch("https://api.nopecha.com/upload",{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify(s)});i.push(c)}await Promise.all(i),window.nopecha=[],e&&(window.parent.postMessage({nopecha:!0,action:"reload"},"*"),window.location.reload(!0))}"block"===document.querySelector("#timeout_widget")?.style?.display&&(window.parent.postMessage({nopecha:!0,action:"reload"},"*"),window.location.reload(!0));var r=document.querySelectorAll("#game_children_challenge ul > li > a");for(const l in r){var d=r[l];l in a&&d.removeEventListener("click",a[l]),a[l]=t.bind(this,parseInt(l)),d.addEventListener("click",a[l])}}catch(e){}}})();

View File

@@ -1 +0,0 @@
(async()=>{function u(e){e=e?.style.background?.trim()?.match(/(?!^)".*?"/g);return e&&0!==e.length?e[0].replaceAll('"',""):null}async function h(){var e=document.querySelector("h2.prompt-text")?.innerText?.replace(/\s+/g," ")?.trim();if(!e)return null;var t={"0430":"a","0441":"c","0501":"d","0065":"e","0435":"e","04bb":"h","0069":"i","0456":"i","0458":"j","03f3":"j","04cf":"l","03bf":"o","043e":"o","0440":"p","0455":"s","0445":"x","0443":"y","0335":"-"};var a=[];for(const i of e){var c=function(e,t,a){for(;(""+e).length<a;)e=""+t+e;return e}(i.charCodeAt(0).toString(16),"0",4);a.push(c in t?t[c]:i)}return a.join("")}let d=null;async function e(){"block"===document.querySelector("div.check")?.style.display?a=a||!0:(a=!1,await Time.sleep(500),document.querySelector("#checkbox")?.click())}async function t(){"EN"!==document.querySelector(".display-language .text").textContent&&(document.querySelector(".language-selector .option:nth-child(23)").click(),await Time.sleep(500));e=500;var e,{task:t,cells:a,urls:c}=await new Promise(o=>{let r=!1;const s=setInterval(async()=>{if(!r){r=!0;var e=await h();if(e){var t=u(document.querySelector(".challenge-example > .image > .image"));if(t&&""!==t){var a=document.querySelectorAll(".task-image");if(9===a.length){var c=[],i=[];for(const l of a){var n=l.querySelector("div.image");if(!n)return void(r=!1);n=u(n);if(!n||""===n)return void(r=!1);c.push(l),i.push(n)}a=JSON.stringify(i);if(d!==a)return d=a,clearInterval(s),r=!1,o({task:e,task_url:t,cells:c,urls:i})}}}r=!1}},e)}),i=await BG.exec("Settings.get");if(i&&i.enabled&&i.hcaptcha_auto_solve){var n=Time.time(),{data:l,metadata:t}=await NopeCHA.post({captcha_type:IS_DEVELOPMENT?"hcaptcha_dev":"hcaptcha",task:t,image_urls:c,key:i.key});if(l){o&&o.postMessage({event:"NopeCHA.metadata",metadata:t});c=parseInt(i.hcaptcha_solve_delay_time)||3e3,t=i.hcaptcha_solve_delay?c-(Time.time()-n):0;0<t&&await Time.sleep(t);for(let e=0;e<l.length;e++)!1!==l[e]&&"true"!==a[e].getAttribute("aria-pressed")&&a[e].click();await Time.sleep(200);try{document.querySelector(".button-submit").click()}catch(e){}}}}let a=!1,c=!1,o=null;for(;;){await Time.sleep(1e3);var i,n=await BG.exec("Settings.get");n&&n.enabled&&(i=await Location.hostname(),n.disabled_hosts.includes(i)||(c||null!==o||(window.addEventListener("message",e=>{"NopeCHA.hook"===e.data.event&&(o=e.source)}),window.location.hash.includes("frame=challenge")&&(c=!0,"firefox"===await BG.exec("Browser.version")?await Script.inject_file("hcaptcha_hook.js"):await BG.exec("Inject.files",{files:["hcaptcha_hook.js"]}))),n.hcaptcha_auto_open&&0!==document.body.getBoundingClientRect()?.width&&0!==document.body.getBoundingClientRect()?.height&&null!==document.querySelector("div.check")?await e():n.hcaptcha_auto_solve&&null!==document.querySelector("h2.prompt-text")&&await t()))}})();

View File

@@ -1 +0,0 @@
(async()=>{let a=null,t=!1,r=!1;function n(e,t,r=!1){e&&(r||a!==e)&&(!0===t&&"false"===e.getAttribute("aria-pressed")||!1===t&&"true"===e.getAttribute("aria-pressed"))&&e.click()}document.addEventListener("mousedown",e=>{"false"===e?.target?.parentNode?.getAttribute("aria-pressed")?(t=!0,r=!0):"true"===e?.target?.parentNode?.getAttribute("aria-pressed")&&(t=!0,r=!1),a=e?.target?.parentNode}),document.addEventListener("mouseup",e=>{t=!1,a=null}),document.addEventListener("mousemove",e=>{t&&(a!==e?.target?.parentNode&&null!==a&&n(a,r,!0),n(e?.target?.parentNode,r))}),window.addEventListener("load",()=>{document.body.appendChild(document.createElement("style")).sheet.insertRule('[aria-pressed="true"] > .border-focus {background-color: #0f0 !important; opacity: 0.3 !important}',0)})})();

File diff suppressed because one or more lines are too long

View file

@@ -1 +0,0 @@
(()=>{let e;function t(){var e=navigator.language.split("-")[0];for(const r of document.querySelectorAll('script[src*="hcaptcha.com/1/api.js"]')){var t=new URL(r.src);"en"!==(t.searchParams.get("hl")||e)&&(t.searchParams.set("hl","en"),r.src=t.toString())}}e=new MutationObserver(t),setTimeout(()=>{t(),e.observe(document.head,{childList:!0})},0)})();
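The language shim above rewrites hCaptcha's api.js script URL so the challenge always loads in English, which keeps prompt matching consistent. A readable sketch of the same behavior (names are illustrative):

// Sketch of the deleted hcaptcha_language.js: force hl=en on the hCaptcha loader.
function forceEnglish() {
  const fallback = navigator.language.split("-")[0];
  for (const script of document.querySelectorAll('script[src*="hcaptcha.com/1/api.js"]')) {
    const url = new URL(script.src);
    if ((url.searchParams.get("hl") || fallback) !== "en") {
      url.searchParams.set("hl", "en");
      script.src = url.toString(); // reloading the script switches the widget to English
    }
  }
}

// Run once right away and again whenever new scripts are added to <head>.
const observer = new MutationObserver(forceEnglish);
setTimeout(() => {
  forceEnglish();
  observer.observe(document.head, { childList: true });
}, 0);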

Binary file not shown.

Before

Width:  |  Height:  |  Size: 14 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 11 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 5.8 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 5.2 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 7.6 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 5.9 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 12 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 9.4 KiB

File diff suppressed because one or more lines are too long

View file

@@ -1,3 +0,0 @@
{
"update_url": "https://clients2.google.com/service/update2/crx",
"name": "NopeCHA: CAPTCHA Solver", "version": "0.3.4", "description": "Automatically solve reCAPTCHA, hCaptcha, FunCAPTCHA, AWS WAF, and text CAPTCHA using AI.", "permissions": ["declarativeNetRequest", "storage", "scripting", "contextMenus", "webRequest"], "content_scripts": [{"matches": ["<all_urls>"], "js": ["utils.js", "content.js"], "run_at": "document_start", "all_frames": true, "match_about_blank": true}, {"matches": ["*://nopecha.com/setup"], "js": ["setup.js"], "run_at": "document_end", "all_frames": true, "match_about_blank": false}, {"matches": ["*://*.hcaptcha.com/captcha/*"], "js": ["hcaptcha.js"], "run_at": "document_end", "all_frames": true, "match_about_blank": false}, {"matches": ["*://*.hcaptcha.com/captcha/*"], "js": ["hcaptcha_fast.js"], "run_at": "document_start", "all_frames": true, "match_about_blank": false}, {"matches": ["<all_urls>"], "js": ["hcaptcha_language.js"], "run_at": "document_end", "all_frames": true, "match_about_blank": false}, {"matches": ["<all_urls>"], "js": ["recaptcha.js", "recaptcha_speech.js"], "run_at": "document_end", "all_frames": true, "match_about_blank": false}, {"matches": ["*://*.google.com/recaptcha/*", "*://*.recaptcha.net/recaptcha/*", "*://recaptcha.net/recaptcha/*"], "js": ["recaptcha_fast.js"], "run_at": "document_start", "all_frames": true, "match_about_blank": false}, {"matches": ["*://*.arkoselabs.com/fc/*", "*://*.funcaptcha.com/fc/*"], "js": ["funcaptcha.js", "funcaptcha_scrape.js"], "run_at": "document_end", "all_frames": true, "match_about_blank": true}, {"matches": ["*://*.arkoselabs.com/fc/*", "*://*.funcaptcha.com/fc/*"], "js": ["funcaptcha_fast.js"], "run_at": "document_start", "all_frames": true, "match_about_blank": true}, {"matches": ["*://nopecha.com/demo/funcaptcha"], "js": ["funcaptcha_demo.js"], "run_at": "document_end", "all_frames": false, "match_about_blank": false}, {"matches": ["<all_urls>"], "js": ["awscaptcha.js"], "run_at": "document_end", "all_frames": true, "match_about_blank": false}, {"matches": ["<all_urls>"], "js": ["textcaptcha.js", "locate.js"], "run_at": "document_end", "all_frames": true, "match_about_blank": true}], "icons": {"16": "icon/16.png", "32": "icon/32.png", "48": "icon/48.png", "128": "icon/128.png"}, "manifest_version": 3, "action": {"default_title": "NopeCHA: CAPTCHA Solver", "default_icon": "icon/16.png", "default_popup": "popup.html"}, "background": {"service_worker": "background.js", "type": "module"}, "host_permissions": ["<all_urls>"]}

View file

@@ -1,801 +0,0 @@
@font-face {
font-family: 'plex-sans';
font-style: normal;
font-weight: 700;
src: url('font/plex-sans-bold.woff2') format('woff2'), url('font/plex-sans-bold.woff') format('woff');
}
@font-face {
font-family: 'plex-sans';
font-style: normal;
font-weight: 400;
src: url('font/plex-sans-regular.woff2') format('woff2'), url('font/plex-sans-regular.woff') format('woff');
}
* {
font-family: 'plex-sans';
box-sizing: border-box;
outline: none;
}
html {
width: 340px;
}
body {
width: 324px;
}
html, body {
background: #1a2432;
color: #fff;
line-height: 1.15;
text-size-adjust: 100%;
}
div {
display: block;
}
a {
text-decoration: none;
}
button, input, optgroup, select, textarea {
font-family: inherit;
font-size: 100%;
line-height: 1.15;
margin: 0px;
}
button, select {
text-transform: none;
}
button, input {
overflow: visible;
}
input {
writing-mode: horizontal-tb !important;
font-style: ;
font-variant-ligatures: ;
font-variant-caps: ;
font-variant-numeric: ;
font-variant-east-asian: ;
font-weight: ;
font-stretch: ;
font-size: ;
font-family: ;
text-rendering: auto;
color: fieldtext;
letter-spacing: normal;
word-spacing: normal;
line-height: normal;
text-transform: none;
text-indent: 0px;
text-shadow: none;
display: inline-block;
text-align: start;
appearance: auto;
-webkit-rtl-ordering: logical;
cursor: text;
background-color: field;
margin: 0em;
padding: 1px 2px;
border-width: 2px;
border-style: inset;
border-color: -internal-light-dark(rgb(118, 118, 118), rgb(133, 133, 133));
border-image: initial;
}
.text_input {
background-color: transparent;
padding: 8px 8px 8px 16px;
color: rgb(255, 255, 255);
outline: none;
border: none;
width: 100%;
font-size: 14px;
}
.text_input.small {
width: 30%;
}
.text_input.text_right {
text-align: right;
}
.hidden {
display: none !important;
}
.hiddenleft {
transform: translateX(-100%) translateZ(0px);
}
.red {
color: #ff6961 !important;
}
/* Remove arrows from number input */
input::-webkit-outer-spin-button,
input::-webkit-inner-spin-button {
-webkit-appearance: none;
margin: 0;
}
input[type=number] {
-moz-appearance: textfield;
}
/* SCROLLBAR */
::-webkit-scrollbar {
width: 6px;
right: 2px;
bottom: 2px;
top: 2px;
border-radius: 3px;
}
::-webkit-scrollbar-track {
background: transparent;
}
::-webkit-scrollbar-thumb {
background: rgba(255, 255, 255, 0.2);
}
/* LOADING OVERLAY */
#loading_overlay {
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
position: absolute;
top: 0;
bottom: 0;
left: 0;
right: 0;
background: #222;
z-index: 10;
}
#loading_overlay .loading_text {
margin-top: 8px;
font-size: 14px;
text-align: center;
}
#loading_overlay .loading_text.timeout > div {
opacity: 0;
animation: fadein 10s linear forwards;
}
#loading_overlay .loading_text.timeout > div:nth-child(1) {
animation-delay: 2000ms;
}
#loading_overlay .loading_text.timeout > div:nth-child(2) {
animation-delay: 4000ms;
}
#loading_overlay .loading_text.timeout > div:nth-child(3) {
animation-delay: 6000ms;
}
@keyframes fadein {
0% {opacity: 0;}
50% {opacity: 0;}
100% {opacity: 1;}
}
/* MISC */
.clickable {
cursor: pointer !important;
}
.clickable:hover {
opacity: 0.8 !important;
}
/* APP */
#app_frame {
display: flex;
flex-direction: column;
overflow: hidden;
transition: height ease 0.2s, min-height ease 0.2s;
min-height: 237px !important;
}
/* HEADER */
.header {
box-sizing: border-box;
padding: 16px;
display: flex;
place-content: space-between;
font-weight: 400;
}
.header.spacedright {
margin-right: 32px;
}
.nav_icon {
border: none;
cursor: pointer;
display: flex;
-webkit-box-align: center;
align-items: center;
-webkit-box-pack: center;
justify-content: center;
background: rgba(2, 13, 28, 0.05);
border-radius: 50%;
width: 32px;
height: 32px;
position: relative;
transition: all 0.3s ease 0s;
fill: rgba(255, 255, 255, 1);
background: rgba(255, 255, 255, 0.1) !important;
}
.nav_icon:hover {
opacity: 0.9;
}
.nav_icon:disabled,
.nav_icon:disabled:hover {
background: none !important;
cursor: unset;
opacity: 0.9;
}
.header_label_container {
box-sizing: border-box;
margin-right: 0px;
display: flex;
flex: 1 1 0%;
-webkit-box-pack: center;
justify-content: center;
-webkit-box-align: center;
align-items: center;
}
.header_label {
box-sizing: border-box;
font-size: 24px;
font-weight: bold;
display: flex;
color: rgb(255, 255, 255);
}
/* PLAN */
.plan_info_box {
position: relative;
width: 100%;
height: 100%;
}
.plan_info_container {
display: flex;
box-sizing: border-box;
position: relative;
}
.plan_info {
box-sizing: border-box;
width: 100%;
padding: 0px 16px 16px;
display: flex;
}
.plan_label {
box-sizing: border-box;
font-weight: bold;
font-size: 14px;
color: rgb(255, 255, 255);
}
.plan_value {
box-sizing: border-box;
margin-left: auto;
display: flex;
}
.plan_button {
display: flex;
background-color: transparent;
color: rgba(255, 255, 255, 0.9);
width: auto;
padding: 0px;
border: none;
-webkit-box-align: center;
align-items: center;
-webkit-box-pack: center;
justify-content: center;
transition: color 0.3s ease 0s, transform 0.1s ease-out 0s, opacity 0.3s ease 0s;
}
.plan_button.link {
color: #0a95ff;
}
.plan_button.link,
.plan_button.link:hover,
.plan_button_label {
box-sizing: border-box;
font-size: 14px;
}
/* WARNING */
.warning_box {
display: flex;
flex-direction: column;
justify-content: center;
background: #1a2432;
position: absolute;
top: 0;
bottom: 8px;
left: 4px;
right: 4px;
border: 1px solid #FCD62E;
border-radius: 0.25rem;
padding: 0.5rem;
margin: 0 4px;
z-index: 1;
}
.warning_box * {
color: #fff;
font-size: 14px;
text-align: center;
}
/* KEY */
.key_label {
display: flex;
flex-direction: row;
flex-wrap: nowrap;
justify-content: space-between;
width: 100%;
}
.key_label > .instructions {
font-weight: normal;
line-height: 16px;
margin-left: 6px;
color: #fff;
font-size: 10px;
}
.settings_text[data-settings="key"] {
background: #1a2432;
position: absolute;
width: calc(100% - 32px);
transition: all ease 0.1s;
z-index: 1;
}
/* .edit_key {
line-height: 16px;
margin-right: 6px;
color: #fff;
font-size: 10px;
} */
.edit_icon {
z-index: 2;
}
/* MENU */
.menu {
box-sizing: border-box;
padding-left: 16px;
}
.menu_item_container {
border-top: none;
border-right: none;
border-left: none;
border-image: initial;
cursor: pointer;
display: flex;
-webkit-box-align: center;
align-items: center;
background-color: transparent;
width: 100%;
padding: 16px;
margin-top: 2px;
border-bottom: 2px solid rgba(255, 255, 255, 0.05);
color: rgba(255, 255, 255, 0.5);
transition: color 0.5s ease 0s, border 0.5s ease 0s;
-webkit-box-pack: justify !important;
justify-content: space-between !important;
}
.menu_item_container:hover {
color: rgb(255, 255, 255);
}
button.menu_item_container {
padding-left: 0px !important;
}
.button_label_container {
box-sizing: border-box;
-webkit-box-align: center;
align-items: center;
display: flex;
}
.button_label_container svg {
fill: rgb(255, 255, 255);
}
.button_label {
box-sizing: border-box;
margin-left: 16px;
font-size: 14px;
font-weight: bold;
}
.menu_item_arrow {
fill: rgb(255, 255, 255);
height: 16px;
width: 16px;
}
/* #export {
color: rgba(255, 255, 255, 0.5);
font-size: 1.2em;
cursor: pointer;
transition: color 0.5s ease 0s, border 0.5s ease 0s;
}
#export:hover {
color: rgb(255, 255, 255);
} */
/* TAB */
.bbflex {
box-sizing: border-box;
display: flex;
-webkit-box-align: center;
align-items: center;
}
.scrolling_container {
box-sizing: border-box;
margin-top: 8px;
margin-left: 16px;
margin-right: 16px;
padding-bottom: 16px;
}
.settings_item_container {
box-sizing: border-box;
margin-bottom: 8px;
border: 1px solid rgba(255, 255, 255, 0.08);
border-radius: 8px;
box-sizing: border-box;
}
.settings_item_container > a {
color: rgba(255, 255, 255, 0.5);
text-decoration: none;
transition: color 0.5s ease 0s, border 0.5s ease 0s;
}
.settings_item {
width: 100%;
background-color: rgba(255, 255, 255, 0.08);
min-height: 48px;
padding: 14px 16px 0px;
border-radius: 8px;
}
.settings_item > div {
-webkit-box-pack: justify;
justify-content: space-between;
-webkit-box-align: center;
align-items: center;
}
.settings_item_label {
font-size: 14px;
font-weight: 600;
color: rgb(255, 255, 255);
padding-left: 16px;
height: 20px;
-webkit-box-align: center;
align-items: center;
}
.settings_toggle {
height: 20px;
max-width: 36px;
min-width: 36px;
border-radius: 10px;
padding: 2px;
transition: background-color 0.3s ease 0s;
opacity: 1;
cursor: pointer;
}
.settings_toggle > div {
width: 16px;
height: 16px;
border-radius: 50%;
transform: translate(16px);
transition: transform 0.3s ease 0s, background-color 0.3s ease 0s;
}
.settings_toggle.on {
background-color: rgb(0, 106, 255);
}
.settings_toggle.off {
background-color: rgb(255, 255, 255);
}
.settings_toggle.on > div {
background-color: rgb(255, 255, 255);
transform: translate(16px);
}
.settings_toggle.off > div {
background-color: rgb(2, 13, 28);
transform: translate(0px);
}
.settings_description_container {
padding: 10px 16px 8px;
-webkit-box-pack: justify;
justify-content: space-between;
}
.settings_description {
font-size: 12px;
color: rgba(255, 255, 255, 0.5);
}
.settings_button {
color: rgba(255, 255, 255, 0.5);
font-size: 14px;
gap: 16px;
}
.settings_button > div {
cursor: pointer;
transition: all 0.3s ease 0s;
}
.settings_button > div:hover {
color: rgb(255, 255, 255);
}
.settings_dropdown_selected {
color: rgba(255, 255, 255, 0.5);
-webkit-box-align: center;
align-items: center;
font-size: 14px;
cursor: pointer;
white-space: nowrap;
}
.settings_dropdown_selected > div {
box-sizing: border-box;
margin-right: 8px;
}
.settings_dropdown_options {
position: relative;
transition: visibility 0.3s ease 0s, opacity 0.3s ease 0s;
opacity: 0;
visibility: hidden;
}
.settings_dropdown_options > div {
position: absolute;
background-color: rgb(255, 255, 255);
border-radius: 4px;
right: 0px;
top: 50%;
transform: translateY(-50%);
border: 1px solid rgba(0, 0, 0, 0.15);
width: auto;
min-width: 60px;
white-space: nowrap;
box-shadow: rgb(0 0 0 / 15%) 0px 2px 4px 0px;
box-sizing: border-box;
padding: 4px;
}
.settings_dropdown_selected:hover > .settings_dropdown_options {
opacity: 1;
visibility: visible;
}
.settings_dropdown_options > div > div {
color: rgba(0, 0, 0, 0.5);
font-weight: 700;
border-radius: 4px;
-webkit-box-pack: center;
justify-content: center;
-webkit-box-align: center;
align-items: center;
line-height: normal;
font-size: 12px;
height: 23px;
cursor: pointer;
padding: 0px 4px;
}
.settings_dropdown_options > div > div:hover {
background-color: rgba(0, 0, 0, 0.08);
}
.settings_dropdown_options > div > div.selected {
color: rgb(0, 106, 255);
}
/* FOOTER */
.footer {
display: flex;
flex-direction: row;
padding: 8px;
margin-top: 8px;
font-size: 10px;
}
.footer * {
color: rgba(255, 255, 255, 0.8);
}
.footer > *:nth-child(1) {
flex-grow: 1;
}
/* LOADING ANIM */
.loading {
display: inline-block;
position: relative;
width: 32px;
height: 16px;
}
.loading div {
position: absolute;
top: 5px;
width: 6px;
height: 6px;
border-radius: 50%;
background: rgba(255, 255, 255, 0.8);
animation-timing-function: cubic-bezier(0, 1, 1, 0);
}
.loading div:nth-child(1) {
left: 4px;
animation: loading1 0.6s infinite;
}
.loading div:nth-child(2) {
left: 4px;
animation: loading2 0.6s infinite;
}
.loading div:nth-child(3) {
left: 16px;
animation: loading2 0.6s infinite;
}
.loading div:nth-child(4) {
left: 28px;
animation: loading3 0.6s infinite;
}
@keyframes loading1 {
0% {
transform: scale(0);
}
100% {
transform: scale(1);
}
}
@keyframes loading3 {
0% {
transform: scale(1);
}
100% {
transform: scale(0);
}
}
@keyframes loading2 {
0% {
transform: translate(0, 0);
}
100% {
transform: translate(12px, 0);
}
}
/* POWER ANIM */
#power .btn {
width: 32px;
height: 32px;
transition: transform 0.3s ease 0s;
}
#power .btn.off {
transform: rotate(-180deg);
}
#power .btn_outline {
position: absolute;
z-index: 2;
height: 100%;
}
#power .btn_outline.spinning {
animation: 1s linear 0s infinite normal none running spinning;
}
@keyframes spinning {
0% {transform: rotate(0deg);}
100% {transform: rotate(360deg);}
}
/* GLOW ANIM */
.hover_glow {
border: none;
outline: none;
cursor: pointer;
position: relative;
z-index: 0;
border-radius: 50%;
}
.hover_glow:before {
content: '';
background: linear-gradient(45deg, #ff0000, #ff7300, #fffb00, #48ff00, #00ffd5, #002bff, #7a00ff, #ff00c8, #ff0000);
position: absolute;
top: -2px;
left:-2px;
background-size: 400%;
z-index: -1;
filter: blur(5px);
width: calc(100% + 4px);
height: calc(100% + 4px);
animation: glowing 20s linear infinite;
opacity: 0;
transition: opacity .3s ease-in-out;
border-radius: 50%;
}
.hover_glow:active:after {
background: transparent;
}
.hover_glow:hover:before {
opacity: 1;
}
.hover_glow:after {
z-index: -1;
content: '';
position: absolute;
width: 100%;
height: 100%;
left: 0;
top: 0;
border-radius: 50%;
}
.hover_glow.static:before {
opacity: 1 !important;
}
@keyframes glowing {
0% {background-position: 0 0;}
50% {background-position: 400% 0;}
100% {background-position: 0 0;}
}
/* BLACKLIST */
.settings_item_header {
box-sizing: border-box;
padding-top: 4px;
padding-bottom: 8px;
font-size: 12px;
background-color: rgb(26, 36, 50);
width: 100%;
letter-spacing: 2px;
font-weight: bold;
color: rgba(255, 255, 255, 0.5);
text-transform: uppercase;
overflow: hidden;
white-space: nowrap;
text-overflow: ellipsis;
position: relative;
top: 12px;
z-index: 1;
}
.settings_item_container.list_item {
border-radius: 0;
border: none;
border-bottom: 2px solid rgba(255, 255, 255, 0.05);
padding-top: 16px;
padding-bottom: 16px;
margin-bottom: 0px;
}
.list_item_row {
box-sizing: border-box;
-webkit-box-align: center;
align-items: center;
-webkit-box-pack: justify;
justify-content: space-between;
width: 100%;
display: flex;
}
#current_page_host {
box-sizing: border-box;
font-size: 14px;
font-weight: bold;
color: rgb(255, 255, 255);
width: fit-content;
overflow: hidden;
white-space: nowrap;
text-overflow: ellipsis;
max-width: 220px;
}
.settings_text.text_input.list_input {
flex-grow: 1;
padding-left: 0;
}
.list_item_button {
border: none;
cursor: pointer;
display: flex;
-webkit-box-align: center;
align-items: center;
-webkit-box-pack: center;
justify-content: center;
background-color: transparent;
padding: 0px;
transition: background-color 0.3s ease 0s, color 0.3s ease 0s, transform 0.1s ease-out 0s, opacity 0.3s ease 0s;
opacity: 0.5;
height: 32px;
width: 32px;
margin-right: -4px;
}
.list_item_button:hover {
opacity: 1.0;
}
.list_item_button:disabled,
.list_item_button:disabled:hover {
opacity: 0.3;
cursor: unset;
}

View file

@@ -1,873 +0,0 @@
<!DOCTYPE html>
<html>
<head>
<link rel="stylesheet" href="popup.css">
</head>
<body>
<div id="loading_overlay">
<a href="https://nopecha.com/discord" target="_blank">
<button class="nav_icon hover_glow static">
<img src="/icon/32.png" alt="NopeCHA">
</button>
</a>
<div class='loading_text'>Loading</div>
<div class='loading_text timeout'>
<div>This is taking longer than usual.</div>
<div>Please close this window and try again.</div>
<div>If the problem persists, contact us on <a style="color: #fff;" href="https://nopecha.com/discord" target="_blank">Discord</a></div>
</div>
</div>
<div id="template" class="hidden">
<div id="disabled_hosts_item" class="settings_item_container list_item">
<div class="list_item_row">
<input type="text" autocomplete="off" spellcheck="false" placeholder="Enter hostname" value="" class="settings_text text_input list_input hostname">
<button class="list_item_button remove">
<svg width="16" height="16" fill="#ffffff">
<path d="M9 2H7v12h2V2zm2.75 0l-1.5 12h1.98l1.5-12h-1.98zm-7.5 0H2.27l1.5 12h1.98L4.25 2zM0 0h16l-2 16H2L0 0z"></path>
</svg>
</button>
</div>
</div>
</div>
<div id="app_frame">
<!-- HCAPTCHA TAB -->
<div class="tab hidden" data-tab="hcaptcha">
<div class="header">
<button class="nav_icon back" data-tabtarget="main">
<svg width="12" height="12" viewBox="0 0 9 16"><path d="M2.684 8l6.038-6.308c.37-.387.37-1.015 0-1.402a.92.92 0 00-1.342 0L0 8l7.38 7.71a.92.92 0 001.342 0c.37-.387.37-1.015 0-1.402L2.684 8z"></path></svg>
</button>
<div class="header_label_container">
<div class="header_label">hCaptcha</div>
</div>
<a href="https://nopecha.com/demo/hcaptcha" target="_blank">
<button class="nav_icon">
<svg width="16" height="16" viewBox="20 20 560 560"><path d="m374.48 524.29h74.9v74.89h-74.9z" fill="#0074bf" opacity=".502"/><path d="m299.59 524.29h74.89v74.89h-74.89zm-74.89 0h74.89v74.89h-74.89z" fill="#0074bf" opacity=".702"/><path d="m149.8 524.29h74.9v74.89h-74.9z" fill="#0074bf" opacity=".502"/><g fill="#0082bf"><path d="m449.39 449.39h74.9v74.9h-74.9z" opacity=".702"/><path d="m374.48 449.39h74.9v74.9h-74.9z" opacity=".8"/><path d="m299.59 449.39h74.89v74.9h-74.89zm-74.89 0h74.89v74.9h-74.89z"/><path d="m149.8 449.39h74.9v74.9h-74.9z" opacity=".8"/><path d="m74.89 449.39h74.9v74.9h-74.9z" opacity=".702"/></g><g fill="#008fbf"><path d="m524.29 374.48h74.89v74.9h-74.89z" opacity=".502"/><path d="m449.39 374.48h74.9v74.9h-74.9z" opacity=".8"/><path d="m374.48 374.48h74.9v74.9h-74.9zm-74.89 0h74.89v74.9h-74.89zm-74.89 0h74.89v74.9h-74.89z"/><path d="m149.8 374.48h74.9v74.9h-74.9z"/><path d="m74.89 374.48h74.9v74.9h-74.9z" opacity=".8"/><path d="m0 374.48h74.89v74.9h-74.89z" opacity=".502"/></g><path d="m524.29 299.59h74.89v74.89h-74.89z" fill="#009dbf" opacity=".702"/><path d="m449.39 299.59h74.9v74.89h-74.9zm-74.91 0h74.9v74.89h-74.9zm-74.89 0h74.89v74.89h-74.89zm-74.89 0h74.89v74.89h-74.89z" fill="#009dbf"/><path d="m149.8 299.59h74.9v74.89h-74.9zm-74.91 0h74.9v74.89h-74.9z" fill="#009dbf"/><path d="m0 299.59h74.89v74.89h-74.89z" fill="#009dbf" opacity=".702"/><path d="m524.29 224.7h74.89v74.89h-74.89z" fill="#00abbf" opacity=".702"/><path d="m449.39 224.7h74.9v74.89h-74.9zm-74.91 0h74.9v74.89h-74.9zm-74.89 0h74.89v74.89h-74.89zm-74.89 0h74.89v74.89h-74.89z" fill="#00abbf"/><path d="m149.8 224.7h74.9v74.89h-74.9zm-74.91 0h74.9v74.89h-74.9z" fill="#00abbf"/><path d="m0 224.7h74.89v74.89h-74.89z" fill="#00abbf" opacity=".702"/><g fill="#00b9bf"><path d="m524.29 149.8h74.89v74.9h-74.89z" opacity=".502"/><path d="m449.39 149.8h74.9v74.9h-74.9z" opacity=".8"/><path d="m374.48 149.8h74.9v74.9h-74.9zm-74.89 0h74.89v74.9h-74.89zm-74.89 0h74.89v74.9h-74.89z"/><path d="m149.8 149.8h74.9v74.9h-74.9z"/><path d="m74.89 149.8h74.9v74.9h-74.9z" opacity=".8"/><path d="m0 149.8h74.89v74.9h-74.89z" opacity=".502"/></g><g fill="#00c6bf"><path d="m449.39 74.89h74.9v74.9h-74.9z" opacity=".702"/><path d="m374.48 74.89h74.9v74.9h-74.9z" opacity=".8"/><path d="m299.59 74.89h74.89v74.9h-74.89zm-74.89 0h74.89v74.9h-74.89z"/><path d="m149.8 74.89h74.9v74.9h-74.9z" opacity=".8"/><path d="m74.89 74.89h74.9v74.9h-74.9z" opacity=".702"/></g><path d="m374.48 0h74.9v74.89h-74.9z" fill="#00d4bf" opacity=".502"/><path d="m299.59 0h74.89v74.89h-74.89zm-74.89 0h74.89v74.89h-74.89z" fill="#00d4bf" opacity=".702"/><path d="m149.8 0h74.9v74.89h-74.9z" fill="#00d4bf" opacity=".502"/><path d="m197.2 275.96 20.87-46.71c7.61-11.97 6.6-26.64-1.72-34.96-.28-.28-.56-.55-.86-.81-.29-.26-.59-.52-.89-.76a21.043 21.043 0 0 0 -1.92-1.37 22.68 22.68 0 0 0 -4.51-2.13c-1.58-.55-3.21-.92-4.87-1.12-1.66-.19-3.34-.2-5-.03s-3.3.51-4.88 1.04c-1.79.55-3.53 1.27-5.19 2.13a32.32 32.32 0 0 0 -4.72 3.02 32.38 32.38 0 0 0 -4.12 3.82 32 32 0 0 0 -3.37 4.48c-.98 1.59-28.57 66.66-39.2 96.62s-6.39 84.91 34.61 125.99c43.48 43.48 106.43 53.41 146.58 23.28.42-.21.84-.44 1.24-.67.41-.23.81-.48 1.2-.74.4-.25.78-.52 1.16-.8.38-.27.75-.56 1.11-.86l123.73-103.32c6.01-4.97 14.9-15.2 6.92-26.88-7.79-11.39-22.55-3.64-28.57.21l-71.21 51.78c-.33.27-.72.48-1.13.6-.42.12-.85.16-1.28.11s-.85-.19-1.22-.4c-.38-.21-.71-.5-.97-.85-1.81-2.22-2.13-8.11.71-10.44l109.16-92.64c9.43-8.49 10.74-20.84 3.1-29.3-7.45-8.29-19.29-8.04-28.8.53l-98.28 
76.83c-.46.38-.99.66-1.56.82s-1.17.21-1.76.13-1.15-.27-1.66-.58c-.51-.3-.96-.7-1.3-1.18-1.94-2.18-2.69-5.89-.5-8.07l111.3-108.01c2.09-1.95 3.78-4.29 4.96-6.88 1.18-2.6 1.85-5.41 1.95-8.26s-.36-5.7-1.36-8.37c-1-2.68-2.51-5.13-4.45-7.22-.97-1.03-2.05-1.95-3.2-2.75a21.14 21.14 0 0 0 -3.69-2.05c-1.3-.55-2.65-.97-4.03-1.26-1.38-.28-2.79-.42-4.2-.41-1.44-.02-2.88.1-4.29.37a21.906 21.906 0 0 0 -7.96 3.16c-1.21.78-2.34 1.68-3.38 2.68l-113.73 106.83c-2.72 2.72-8.04 0-8.69-3.18-.06-.28-.08-.57-.07-.86s.06-.58.15-.85c.08-.28.2-.55.35-.79.15-.25.33-.48.54-.68l87.05-99.12a21.38 21.38 0 0 0 6.82-15.3c.11-5.81-2.15-11.42-6.25-15.53-4.11-4.12-9.71-6.4-15.52-6.31s-11.34 2.53-15.32 6.77l-132.01 145.95c-4.73 4.73-11.7 4.97-15.02 2.22-.51-.4-.93-.9-1.24-1.46-.32-.56-.52-1.18-.6-1.82-.08-.65-.03-1.3.14-1.92s.46-1.21.85-1.72z" fill="#fff"/></svg>
</button>
</a>
</div>
<div style="position: relative; overflow: hidden; width: 100%; height: auto; min-height: 0px; max-height: 402px;">
<div style="position: relative; overflow: auto; margin-bottom: -15px; min-height: 15px; max-height: 417px;">
<div class="scrolling_container">
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M8 0A8 8 0 11.582 11.001h2.221a6 6 0 100-6L.582 4.999A8.003 8.003 0 018 0zM7 5l4 3-4 3-.001-2H0V7h6.999L7 5z"></path></svg>
<div class="settings_item_label bbflex">Auto-Open</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="hcaptcha_auto_open">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Automatically opens captcha challenges.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M13.336 9.007l.044.085 2.58 6.193a.516.516 0 01-.91.48l-.043-.083-.728-1.747h-2.753l-.727 1.747a.516.516 0 01-.586.306l-.089-.028a.516.516 0 01-.306-.586l.028-.089 2.58-6.193a.517.517 0 01.91-.085zM4.128 1.728V4.13a5.161 5.161 0 004.13 9.187v2.095A7.226 7.226 0 014.128 1.73zm8.775 8.904l-.947 2.271h1.893l-.946-2.27zM8.258 0v8.258H6.193V0h2.065zm2.065 1.728a7.233 7.233 0 014.055 5.498h-2.094a5.162 5.162 0 00-1.962-3.097V1.728z"></path></svg>
<div class="settings_item_label bbflex">Auto-Solve</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="hcaptcha_auto_solve">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Automatically solves captcha challenges.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M12.214 10l.964 3H14.5a1.5 1.5 0 010 3h-13a1.5 1.5 0 010-3h1.322l.964-3h8.428zm-1.607-5l.964 3H4.429l.964-3h5.214zM9 0l.964 3H6.036L7 0h2z"></path></svg>
<div class="settings_item_label bbflex">Delay solving</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="hcaptcha_solve_delay">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Adds a delay to avoid detection.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M16 8a8 8 0 01-8 8v-1a7 7 0 007-7h1zm-4.667 3.536l.667.707C10.976 13.33 9.562 14 8 14c-1.562 0-2.976-.671-4-1.757l.667-.707C5.52 12.441 6.698 13 8 13s2.48-.56 3.333-1.464zM7 4.5a.5.5 0 01.492.41L7.5 5v3.5H10a.5.5 0 01.492.41L10.5 9a.5.5 0 01-.41.492L10 9.5H6.5V5a.5.5 0 01.5-.5zM8 0v1a7 7 0 00-7 7H0a8 8 0 018-8zm0 2c1.562 0 2.977.672 4 1.758l-.666.707C10.48 3.56 9.302 3 8 3s-2.48.56-3.334 1.465L4 3.758C5.023 2.672 6.438 2 8 2z"></path></svg>
<div class="settings_item_label bbflex">Delay Timer</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Milliseconds to delay solving.</div>
<input type="number" autocomplete="off" spellcheck="false" placeholder="Delay" value="" class="settings_text text_input text_right small" data-settings="hcaptcha_solve_delay_time">
</div>
</div>
</div>
</div>
</div>
</div>
<!-- RECAPTCHA TAB -->
<div class="tab hidden" data-tab="recaptcha">
<div class="header">
<button class="nav_icon back" data-tabtarget="main">
<svg width="12" height="12" viewBox="0 0 9 16"><path d="M2.684 8l6.038-6.308c.37-.387.37-1.015 0-1.402a.92.92 0 00-1.342 0L0 8l7.38 7.71a.92.92 0 001.342 0c.37-.387.37-1.015 0-1.402L2.684 8z"></path></svg>
</button>
<div class="header_label_container">
<div class="header_label">reCAPTCHA</div>
</div>
<a href="https://nopecha.com/demo/recaptcha" target="_blank">
<button class="nav_icon">
<svg width="16" height="16" viewBox="0 0 64 64"><path d="M64 31.955l-.033-1.37V4.687l-7.16 7.16C50.948 4.674 42.033.093 32.05.093c-10.4 0-19.622 4.96-25.458 12.64l11.736 11.86a15.55 15.55 0 0 1 4.754-5.334c2.05-1.6 4.952-2.906 8.968-2.906.485 0 .86.057 1.135.163 4.976.393 9.288 3.14 11.828 7.124l-8.307 8.307L64 31.953" fill="#1c3aa9"/><path d="M31.862.094l-1.37.033H4.594l7.16 7.16C4.58 13.147 0 22.06 0 32.046c0 10.4 4.96 19.622 12.64 25.458L24.5 45.768a15.55 15.55 0 0 1-5.334-4.754c-1.6-2.05-2.906-4.952-2.906-8.968 0-.485.057-.86.163-1.135.393-4.976 3.14-9.288 7.124-11.828l8.307 8.307L31.86.095" fill="#4285f4"/><path d="M.001 32.045l.033 1.37v25.898l7.16-7.16c5.86 7.173 14.774 11.754 24.76 11.754 10.4 0 19.622-4.96 25.458-12.64l-11.736-11.86a15.55 15.55 0 0 1-4.754 5.334c-2.05 1.6-4.952 2.906-8.968 2.906-.485 0-.86-.057-1.135-.163-4.976-.393-9.288-3.14-11.828-7.124l8.307-8.307c-10.522.04-22.4.066-27.295-.005" fill="#ababab"/></svg>
</button>
</a>
</div>
<div style="position: relative; overflow: hidden; width: 100%; height: auto; min-height: 0px; max-height: 402px;">
<div style="position: relative; overflow: auto; margin-bottom: -15px; min-height: 15px; max-height: 417px;">
<div class="scrolling_container">
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M8 0A8 8 0 11.582 11.001h2.221a6 6 0 100-6L.582 4.999A8.003 8.003 0 018 0zM7 5l4 3-4 3-.001-2H0V7h6.999L7 5z"></path></svg>
<div class="settings_item_label bbflex">Auto-Open</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="recaptcha_auto_open">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Automatically opens captcha challenges.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M13.336 9.007l.044.085 2.58 6.193a.516.516 0 01-.91.48l-.043-.083-.728-1.747h-2.753l-.727 1.747a.516.516 0 01-.586.306l-.089-.028a.516.516 0 01-.306-.586l.028-.089 2.58-6.193a.517.517 0 01.91-.085zM4.128 1.728V4.13a5.161 5.161 0 004.13 9.187v2.095A7.226 7.226 0 014.128 1.73zm8.775 8.904l-.947 2.271h1.893l-.946-2.27zM8.258 0v8.258H6.193V0h2.065zm2.065 1.728a7.233 7.233 0 014.055 5.498h-2.094a5.162 5.162 0 00-1.962-3.097V1.728z"></path></svg>
<div class="settings_item_label bbflex">Auto-Solve</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="recaptcha_auto_solve">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Automatically solves captcha challenges.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M12.214 10l.964 3H14.5a1.5 1.5 0 010 3h-13a1.5 1.5 0 010-3h1.322l.964-3h8.428zm-1.607-5l.964 3H4.429l.964-3h5.214zM9 0l.964 3H6.036L7 0h2z"></path></svg>
<div class="settings_item_label bbflex">Delay solving</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="recaptcha_solve_delay">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Adds a delay to avoid detection.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M16 8a8 8 0 01-8 8v-1a7 7 0 007-7h1zm-4.667 3.536l.667.707C10.976 13.33 9.562 14 8 14c-1.562 0-2.976-.671-4-1.757l.667-.707C5.52 12.441 6.698 13 8 13s2.48-.56 3.333-1.464zM7 4.5a.5.5 0 01.492.41L7.5 5v3.5H10a.5.5 0 01.492.41L10.5 9a.5.5 0 01-.41.492L10 9.5H6.5V5a.5.5 0 01.5-.5zM8 0v1a7 7 0 00-7 7H0a8 8 0 018-8zm0 2c1.562 0 2.977.672 4 1.758l-.666.707C10.48 3.56 9.302 3 8 3s-2.48.56-3.334 1.465L4 3.758C5.023 2.672 6.438 2 8 2z"></path></svg>
<div class="settings_item_label bbflex">Delay Timer</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Milliseconds to delay solving.</div>
<input type="number" autocomplete="off" spellcheck="false" placeholder="Delay" value="" class="settings_text text_input text_right small" data-settings="recaptcha_solve_delay_time">
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M10.833 0C12.03 0 13 .895 13 2v2h-2V2H2v12h2v2H2.167C.97 16 0 15.105 0 14V2C0 .895.97 0 2.167 0h8.666zM9.5 5a4.5 4.5 0 013.81 6.895l2.397 2.398a1 1 0 01-1.32 1.497l-.094-.083-2.398-2.396A4.5 4.5 0 119.5 5zm0 2a2.5 2.5 0 100 5 2.5 2.5 0 000-5z"></path></svg>
<div class="settings_item_label bbflex">Solve Method</div>
</div>
<div class="bbflex">
<div class="settings_dropdown_selected bbflex">
<div id="recaptcha_solve_method">Image</div>
<div class="settings_dropdown_options" style="box-sizing: border-box;">
<div>
<div data-displays="#recaptcha_solve_method" data-value="Image" class="settings_dropdown bbflex" data-settings="recaptcha_solve_method">Image</div>
<div data-displays="#recaptcha_solve_method" data-value="Speech" class="settings_dropdown bbflex" data-settings="recaptcha_solve_method">Speech</div>
</div>
</div>
<svg width="16" height="16" fill="rgba(255, 255, 255, 0.5)"><path d="M5.302 9.225L8 11.878l2.7-2.653a.77.77 0 011.079 0 .744.744 0 010 1.06L8 14l-3.778-3.714a.744.744 0 010-1.061.77.77 0 011.08 0zM7.999 2l3.783 3.715.009.009a.744.744 0 01-.01 1.051.77.77 0 01-1.078 0L8.004 4.122 5.306 6.775a.77.77 0 01-1.088-.008.745.745 0 01.008-1.053L7.999 2z" fill-rule="evenodd"></path></svg>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Method used to solve the captcha.</div>
</div>
</div>
</div>
</div>
</div>
</div>
<!-- FUNCAPTCHA TAB -->
<div class="tab hidden" data-tab="funcaptcha">
<div class="header">
<button class="nav_icon back" data-tabtarget="main">
<svg width="12" height="12" viewBox="0 0 9 16"><path d="M2.684 8l6.038-6.308c.37-.387.37-1.015 0-1.402a.92.92 0 00-1.342 0L0 8l7.38 7.71a.92.92 0 001.342 0c.37-.387.37-1.015 0-1.402L2.684 8z"></path></svg>
</button>
<div class="header_label_container">
<div class="header_label">FunCAPTCHA</div>
</div>
<a href="https://nopecha.com/demo/funcaptcha" target="_blank">
<button class="nav_icon">
<svg width="16" height="16" viewBox="18 30 37 34"><path d="M52.107,37.991,38.249,30a3.992,3.992,0,0,0-1.919-.533A3.606,3.606,0,0,0,34.412,30L20.555,37.991a3.829,3.829,0,0,0-1.919,3.3V57.338a3.9,3.9,0,0,0,1.919,3.3l.959.533,4.423,2.558V56.326l10.393-5.969,10.393,5.969v7.355l4.423-2.558.959-.586a3.829,3.829,0,0,0,1.919-3.3V41.243A3.857,3.857,0,0,0,52.107,37.991ZM46.617,47.9,38.2,43a3.99,3.99,0,0,0-1.918-.533A3.607,3.607,0,0,0,34.359,43l-8.474,4.9V43.268l8.688-5.01a3.425,3.425,0,0,1,3.358,0l8.688,5.01Z" fill="#50b95d"/></svg>
</button>
</a>
</div>
<div style="position: relative; overflow: hidden; width: 100%; height: auto; min-height: 0px; max-height: 402px;">
<div style="position: relative; overflow: auto; margin-bottom: -15px; min-height: 15px; max-height: 417px;">
<div class="scrolling_container">
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M8 0A8 8 0 11.582 11.001h2.221a6 6 0 100-6L.582 4.999A8.003 8.003 0 018 0zM7 5l4 3-4 3-.001-2H0V7h6.999L7 5z"></path></svg>
<div class="settings_item_label bbflex">Auto-Open</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="funcaptcha_auto_open">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Automatically opens captcha challenges.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M13.336 9.007l.044.085 2.58 6.193a.516.516 0 01-.91.48l-.043-.083-.728-1.747h-2.753l-.727 1.747a.516.516 0 01-.586.306l-.089-.028a.516.516 0 01-.306-.586l.028-.089 2.58-6.193a.517.517 0 01.91-.085zM4.128 1.728V4.13a5.161 5.161 0 004.13 9.187v2.095A7.226 7.226 0 014.128 1.73zm8.775 8.904l-.947 2.271h1.893l-.946-2.27zM8.258 0v8.258H6.193V0h2.065zm2.065 1.728a7.233 7.233 0 014.055 5.498h-2.094a5.162 5.162 0 00-1.962-3.097V1.728z"></path></svg>
<div class="settings_item_label bbflex">Auto-Solve</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="funcaptcha_auto_solve">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Automatically solves captcha challenges.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M12.214 10l.964 3H14.5a1.5 1.5 0 010 3h-13a1.5 1.5 0 010-3h1.322l.964-3h8.428zm-1.607-5l.964 3H4.429l.964-3h5.214zM9 0l.964 3H6.036L7 0h2z"></path></svg>
<div class="settings_item_label bbflex">Delay solving</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="funcaptcha_solve_delay">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Adds a delay to avoid detection.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M16 8a8 8 0 01-8 8v-1a7 7 0 007-7h1zm-4.667 3.536l.667.707C10.976 13.33 9.562 14 8 14c-1.562 0-2.976-.671-4-1.757l.667-.707C5.52 12.441 6.698 13 8 13s2.48-.56 3.333-1.464zM7 4.5a.5.5 0 01.492.41L7.5 5v3.5H10a.5.5 0 01.492.41L10.5 9a.5.5 0 01-.41.492L10 9.5H6.5V5a.5.5 0 01.5-.5zM8 0v1a7 7 0 00-7 7H0a8 8 0 018-8zm0 2c1.562 0 2.977.672 4 1.758l-.666.707C10.48 3.56 9.302 3 8 3s-2.48.56-3.334 1.465L4 3.758C5.023 2.672 6.438 2 8 2z"></path></svg>
<div class="settings_item_label bbflex">Delay Timer</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Milliseconds to delay solving.</div>
<input type="number" autocomplete="off" spellcheck="false" placeholder="Delay" value="" class="settings_text text_input text_right small" data-settings="funcaptcha_solve_delay_time">
</div>
</div>
</div>
</div>
</div>
</div>
<!-- AWSCAPTCHA TAB -->
<div class="tab hidden" data-tab="awscaptcha">
<div class="header">
<button class="nav_icon back" data-tabtarget="main">
<svg width="12" height="12" viewBox="0 0 9 16"><path d="M2.684 8l6.038-6.308c.37-.387.37-1.015 0-1.402a.92.92 0 00-1.342 0L0 8l7.38 7.71a.92.92 0 001.342 0c.37-.387.37-1.015 0-1.402L2.684 8z"></path></svg>
</button>
<div class="header_label_container">
<div class="header_label">AWS CAPTCHA</div>
</div>
<a href="https://nopecha.com/demo/awscaptcha" target="_blank">
<button class="nav_icon">
<svg width="16" height="16" viewBox="0 0 256 310"><path d="M0 173.367l.985.52 15.49 1.762 19.455-1.082.856-.45-17.267-1.501-19.519.75z" fill="#B6C99C"/><path d="M128 .698L73.948 27.724V201.23L128 211.148l1.85-2.5V5.148L128 .699z" fill="#4C612C"/><path d="M128 .698v217.7l54.053-16.141V27.724L128 .698z" fill="#769B3F"/><path d="M219.214 174.117l.922.623 19.339 1.074 15.656-1.779.869-.669-19.52-.75-17.266 1.501z" fill="#B6C99C"/><path d="M219.214 210.153l20.27 2.627.543-.998v-35.397l-.543-1.141-20.27-1.126v36.035z" fill="#4C612C"/><path d="M36.786 210.153l-20.27 2.627-.342-.925v-36.001l.342-.61 20.27-1.126v36.035z" fill="#769B3F"/><path d="M125.748 208.651l-89.713-15.765-19.52 1.876.889.891 85.223 17.265.974-.513 22.147-3.754z" fill="#B6C99C"/><path d="M0 191.385v54.428L89.713 290.8v.055L128 310l1.6-3.002v-118.85l-1.6-3.746-38.287-3.753v28.888l-73.197-14.81v-19.483L0 173.367v18.018z" fill="#4C612C"/><path d="M128 209.026l21.771 3.754 2.804.118 85.285-17.129 1.624-1.007-19.144-1.877L128 209.026z" fill="#B6C99C"/><path d="M239.484 175.243v19.483l-73.196 14.811v-30.165L128 183.126V310l128-64.188v-72.446l-16.516 1.877z" fill="#769B3F"/><path d="M166.287 182.375L128 179.372l-38.288 3.003L128 186.13l38.287-3.754z" fill="#B6C99C"/></svg>
</button>
</a>
</div>
<div style="position: relative; overflow: hidden; width: 100%; height: auto; min-height: 0px; max-height: 402px;">
<div style="position: relative; overflow: auto; margin-bottom: -15px; min-height: 15px; max-height: 417px;">
<div class="scrolling_container">
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M8 0A8 8 0 11.582 11.001h2.221a6 6 0 100-6L.582 4.999A8.003 8.003 0 018 0zM7 5l4 3-4 3-.001-2H0V7h6.999L7 5z"></path></svg>
<div class="settings_item_label bbflex">Auto-Open</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="awscaptcha_auto_open">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Automatically opens captcha challenges.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M13.336 9.007l.044.085 2.58 6.193a.516.516 0 01-.91.48l-.043-.083-.728-1.747h-2.753l-.727 1.747a.516.516 0 01-.586.306l-.089-.028a.516.516 0 01-.306-.586l.028-.089 2.58-6.193a.517.517 0 01.91-.085zM4.128 1.728V4.13a5.161 5.161 0 004.13 9.187v2.095A7.226 7.226 0 014.128 1.73zm8.775 8.904l-.947 2.271h1.893l-.946-2.27zM8.258 0v8.258H6.193V0h2.065zm2.065 1.728a7.233 7.233 0 014.055 5.498h-2.094a5.162 5.162 0 00-1.962-3.097V1.728z"></path></svg>
<div class="settings_item_label bbflex">Auto-Solve</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="awscaptcha_auto_solve">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Automatically solves captcha challenges.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M12.214 10l.964 3H14.5a1.5 1.5 0 010 3h-13a1.5 1.5 0 010-3h1.322l.964-3h8.428zm-1.607-5l.964 3H4.429l.964-3h5.214zM9 0l.964 3H6.036L7 0h2z"></path></svg>
<div class="settings_item_label bbflex">Delay solving</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="awscaptcha_solve_delay">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Adds a delay to avoid detection.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M16 8a8 8 0 01-8 8v-1a7 7 0 007-7h1zm-4.667 3.536l.667.707C10.976 13.33 9.562 14 8 14c-1.562 0-2.976-.671-4-1.757l.667-.707C5.52 12.441 6.698 13 8 13s2.48-.56 3.333-1.464zM7 4.5a.5.5 0 01.492.41L7.5 5v3.5H10a.5.5 0 01.492.41L10.5 9a.5.5 0 01-.41.492L10 9.5H6.5V5a.5.5 0 01.5-.5zM8 0v1a7 7 0 00-7 7H0a8 8 0 018-8zm0 2c1.562 0 2.977.672 4 1.758l-.666.707C10.48 3.56 9.302 3 8 3s-2.48.56-3.334 1.465L4 3.758C5.023 2.672 6.438 2 8 2z"></path></svg>
<div class="settings_item_label bbflex">Delay Timer</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Milliseconds to delay solving.</div>
<input type="number" autocomplete="off" spellcheck="false" placeholder="Delay" value="" class="settings_text text_input text_right small" data-settings="awscaptcha_solve_delay_time">
</div>
</div>
</div>
</div>
</div>
</div>
<!-- TEXTCAPTCHA TAB -->
<div class="tab hidden" data-tab="textcaptcha">
<div class="header">
<button class="nav_icon back" data-tabtarget="main">
<svg width="12" height="12" viewBox="0 0 9 16"><path d="M2.684 8l6.038-6.308c.37-.387.37-1.015 0-1.402a.92.92 0 00-1.342 0L0 8l7.38 7.71a.92.92 0 001.342 0c.37-.387.37-1.015 0-1.402L2.684 8z"></path></svg>
</button>
<div class="header_label_container">
<div class="header_label">Text CAPTCHA</div>
</div>
<a href="https://nopecha.com/demo/textcaptcha" target="_blank">
<button class="nav_icon">
<svg width="16" height="16" viewBox="0 0 100 100"><g transform="translate(0,51) scale(0.01,-0.01)"><path fill="#ffffff" d="M7287.1,5007c-697.7-46.8-1170-242.5-1467.8-608.4c-227.6-285.1-274.4-463.7-170.2-655.2c89.3-159.5,185.1-202.1,574.4-259.5c234-34,310.6-14.9,425.4,114.9c331.8,370.1,385,399.9,757.3,414.8c393.6,19.1,689.2-59.6,804.1-212.7c72.3-95.7,119.1-274.4,119.1-451v-151l-174.5-55.3c-251-80.8-461.6-131.9-989.2-240.4c-908.3-185.1-1240.2-321.2-1516.7-619c-231.9-251-340.4-551-340.4-933.9C5308.8,661,5713,148.3,6404.3-34.6c119.1-31.9,223.4-40.4,521.2-38.3c342.5,0,387.2,6.4,559.5,57.4c310.6,95.7,604.1,261.7,838.1,470.1l110.6,100l44.7-144.7c59.6-193.6,125.5-289.3,238.3-344.6c80.8-38.3,131.9-44.7,363.7-44.7c363.8,0,472.3,46.8,572.2,242.5c48.9,95.7,44.7,202.1-17,502c-34.1,161.7-38.3,321.2-40.4,1531.6c-2.1,1159.4-6.4,1378.5-38.3,1552.9c-57.4,338.2-112.7,459.5-287.2,642.4c-259.5,272.3-597.8,423.3-1089.2,482.9C7974.2,4998.5,7459.4,5017.6,7287.1,5007z M8325.2,1962.9c-10.6-346.7-17-416.9-59.6-523.3c-97.9-255.3-353.1-482.9-642.4-570.1c-202.1-61.7-489.3-51.1-644.5,23.4c-134,65.9-263.8,195.7-329.7,329.7c-76.6,159.5-74.5,389.3,6.4,525.4c117,200,302.1,282.9,950.9,429.7c240.4,55.3,493.5,117,563.7,138.3c70.2,23.4,136.2,42.5,146.8,42.5C8327.4,2360.7,8329.5,2186.3,8325.2,1962.9z"/><path fill="#ffffff" d="M2909.2,1996.9c-38.3-12.8-104.2-57.4-148.9-100c-72.3-72.3-187.2-359.5-1248.7-3097.3C869.2-2861.7,333.1-4252.9,322.5-4295.5c-40.4-161.7,68.1-378.7,229.7-451c74.5-34,140.4-40.4,448.9-40.4c410.6,0,491.4,21.3,591.4,153.2c34,44.7,148.9,342.5,299.9,772.2l242.5,702h1346.6h1348.7l217-585c119.1-321.2,236.1-640.3,261.6-706.2c55.3-153.2,131.9-244.6,244.7-295.7c117-53.2,776.4-59.6,899.8-8.5c157.4,65.9,259.5,217,259.5,380.8c0,76.6-244.6,708.4-1216.8,3144.1c-1327.4,3331.3-1257.2,3171.7-1452.9,3227C3938.8,2024.6,3009.2,2024.6,2909.2,1996.9z M3945.2-851.5l444.6-1201.9l-906.2-6.4c-497.8-2.1-908.3,0-912.6,4.3c-6.4,6.4,782.8,2216.6,878.6,2459.1C3466.6,446.2,3441.1,514.2,3945.2-851.5z"/></g></svg>
</button>
</a>
</div>
<div style="position: relative; overflow: hidden; width: 100%; height: auto; min-height: 0px; max-height: 402px;">
<div style="position: relative; overflow: auto; margin-bottom: -15px; min-height: 15px; max-height: 417px;">
<div class="scrolling_container">
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M13.336 9.007l.044.085 2.58 6.193a.516.516 0 01-.91.48l-.043-.083-.728-1.747h-2.753l-.727 1.747a.516.516 0 01-.586.306l-.089-.028a.516.516 0 01-.306-.586l.028-.089 2.58-6.193a.517.517 0 01.91-.085zM4.128 1.728V4.13a5.161 5.161 0 004.13 9.187v2.095A7.226 7.226 0 014.128 1.73zm8.775 8.904l-.947 2.271h1.893l-.946-2.27zM8.258 0v8.258H6.193V0h2.065zm2.065 1.728a7.233 7.233 0 014.055 5.498h-2.094a5.162 5.162 0 00-1.962-3.097V1.728z"></path></svg>
<div class="settings_item_label bbflex">Auto-Solve</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="textcaptcha_auto_solve">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Automatically solves captcha challenges.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M12.214 10l.964 3H14.5a1.5 1.5 0 010 3h-13a1.5 1.5 0 010-3h1.322l.964-3h8.428zm-1.607-5l.964 3H4.429l.964-3h5.214zM9 0l.964 3H6.036L7 0h2z"></path></svg>
<div class="settings_item_label bbflex">Delay solving</div>
</div>
<div class="bbflex">
<div class="settings_toggle off" data-settings="textcaptcha_solve_delay">
<div></div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Adds a delay to avoid detection.</div>
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M16 8a8 8 0 01-8 8v-1a7 7 0 007-7h1zm-4.667 3.536l.667.707C10.976 13.33 9.562 14 8 14c-1.562 0-2.976-.671-4-1.757l.667-.707C5.52 12.441 6.698 13 8 13s2.48-.56 3.333-1.464zM7 4.5a.5.5 0 01.492.41L7.5 5v3.5H10a.5.5 0 01.492.41L10.5 9a.5.5 0 01-.41.492L10 9.5H6.5V5a.5.5 0 01.5-.5zM8 0v1a7 7 0 00-7 7H0a8 8 0 018-8zm0 2c1.562 0 2.977.672 4 1.758l-.666.707C10.48 3.56 9.302 3 8 3s-2.48.56-3.334 1.465L4 3.758C5.023 2.672 6.438 2 8 2z"></path></svg>
<div class="settings_item_label bbflex">Delay Timer</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">Milliseconds to delay solving.</div>
<input type="number" autocomplete="off" spellcheck="false" placeholder="Delay" value="" class="settings_text text_input text_right small" data-settings="textcaptcha_solve_delay_time">
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M10.833 0C12.03 0 13 .895 13 2v2h-2V2H2v12h2v2H2.167C.97 16 0 15.105 0 14V2C0 .895.97 0 2.167 0h8.666zM9.5 5a4.5 4.5 0 013.81 6.895l2.397 2.398a1 1 0 01-1.32 1.497l-.094-.083-2.398-2.396A4.5 4.5 0 119.5 5zm0 2a2.5 2.5 0 100 5 2.5 2.5 0 000-5z"></path></svg>
<div class="settings_item_label bbflex">Image Element</div>
</div>
<div class="bbflex">
<div class="settings_button bbflex">
<div class="locate" data-key="textcaptcha_image_selector">Locate</div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">CSS selector for the captcha image.</div>
<input type="text" autocomplete="off" spellcheck="false" placeholder="Enter CSS selector" value="" class="settings_text text_input text_right" data-settings="textcaptcha_image_selector">
</div>
</div>
<div class="settings_item_container">
<div class="settings_item">
<div class="bbflex">
<div class="bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M10.833 0C12.03 0 13 .895 13 2v2h-2V2H2v12h2v2H2.167C.97 16 0 15.105 0 14V2C0 .895.97 0 2.167 0h8.666zM9.5 5a4.5 4.5 0 013.81 6.895l2.397 2.398a1 1 0 01-1.32 1.497l-.094-.083-2.398-2.396A4.5 4.5 0 119.5 5zm0 2a2.5 2.5 0 100 5 2.5 2.5 0 000-5z"></path></svg>
<div class="settings_item_label bbflex">Input Element</div>
</div>
<div class="bbflex">
<div class="settings_button bbflex">
<div class="locate" data-key="textcaptcha_input_selector">Locate</div>
</div>
</div>
</div>
</div>
<div class="settings_description_container bbflex">
<div class="settings_description">CSS selector for the captcha input.</div>
<input type="text" autocomplete="off" spellcheck="false" placeholder="Enter CSS selector" value="" class="settings_text text_input text_right" data-settings="textcaptcha_input_selector">
</div>
</div>
</div>
</div>
</div>
</div>
<!-- MAIN TAB -->
<div class="tab" data-tab="main">
<div class="header">
<!-- <a href="https://nopecha.com/discord" target="_blank">
<button class="nav_icon hover_glow">
<svg width="18" height="18" viewBox="0 0 71 55"><path d="M60.1045 4.8978C55.5792 2.8214 50.7265 1.2916 45.6527 0.41542C45.5603 0.39851 45.468 0.440769 45.4204 0.525289C44.7963 1.6353 44.105 3.0834 43.6209 4.2216C38.1637 3.4046 32.7345 3.4046 27.3892 4.2216C26.905 3.0581 26.1886 1.6353 25.5617 0.525289C25.5141 0.443589 25.4218 0.40133 25.3294 0.41542C20.2584 1.2888 15.4057 2.8186 10.8776 4.8978C10.8384 4.9147 10.8048 4.9429 10.7825 4.9795C1.57795 18.7309 -0.943561 32.1443 0.293408 45.3914C0.299005 45.4562 0.335386 45.5182 0.385761 45.5576C6.45866 50.0174 12.3413 52.7249 18.1147 54.5195C18.2071 54.5477 18.305 54.5139 18.3638 54.4378C19.7295 52.5728 20.9469 50.6063 21.9907 48.5383C22.0523 48.4172 21.9935 48.2735 21.8676 48.2256C19.9366 47.4931 18.0979 46.6 16.3292 45.5858C16.1893 45.5041 16.1781 45.304 16.3068 45.2082C16.679 44.9293 17.0513 44.6391 17.4067 44.3461C17.471 44.2926 17.5606 44.2813 17.6362 44.3151C29.2558 49.6202 41.8354 49.6202 53.3179 44.3151C53.3935 44.2785 53.4831 44.2898 53.5502 44.3433C53.9057 44.6363 54.2779 44.9293 54.6529 45.2082C54.7816 45.304 54.7732 45.5041 54.6333 45.5858C52.8646 46.6197 51.0259 47.4931 49.0921 48.2228C48.9662 48.2707 48.9102 48.4172 48.9718 48.5383C50.038 50.6034 51.2554 52.5699 52.5959 54.435C52.6519 54.5139 52.7526 54.5477 52.845 54.5195C58.6464 52.7249 64.529 50.0174 70.6019 45.5576C70.6551 45.5182 70.6887 45.459 70.6943 45.3942C72.1747 30.0791 68.2147 16.7757 60.1968 4.9823C60.1772 4.9429 60.1437 4.9147 60.1045 4.8978ZM23.7259 37.3253C20.2276 37.3253 17.3451 34.1136 17.3451 30.1693C17.3451 26.225 20.1717 23.0133 23.7259 23.0133C27.308 23.0133 30.1626 26.2532 30.1066 30.1693C30.1066 34.1136 27.28 37.3253 23.7259 37.3253ZM47.3178 37.3253C43.8196 37.3253 40.9371 34.1136 40.9371 30.1693C40.9371 26.225 43.7636 23.0133 47.3178 23.0133C50.9 23.0133 53.7545 26.2532 53.6986 30.1693C53.6986 34.1136 50.9 37.3253 47.3178 37.3253Z"/></svg>
</button>
</a> -->
<div data-tabtarget="settings">
<button class="nav_icon">
<svg width="12" height="12" viewBox="0 0 16 16"><path fill-rule="evenodd" d="M1 0h14a1 1 0 010 2H1a1 1 0 110-2zm0 6h14a1 1 0 010 2H1a1 1 0 110-2zm0 6h14a1 1 0 010 2H1a1 1 0 010-2z" opacity="0.603"></path></svg>
</button>
</div>
<div class="header_label_container">
<a href="https://nopecha.com" target="_blank">
<div class="header_label">NopeCHA</div>
</a>
</div>
<button id='power' class="nav_icon">
<div style="width: 32px; height: 32px;">
<div class="btn">
<svg width="32" height="32" viewBox="0 0 72 72"><path d="M31.5 18.57a18 18 0 109-.01v3.13a15 15 0 11-9 0v-3.12zm3-5.07v22a1.5 1.5 0 003 0v-22a1.5 1.5 0 00-3 0zM36 66a30 30 0 110-60 30 30 0 010 60z" fill="#ffffff"></path></svg>
</div>
</div>
<div class="btn_outline spinning hidden">
<svg width="32" height="32" viewBox="0 0 72 72"><path fill="#A0FEDA" fill-rule="evenodd" d="M.055 38H3.06C4.093 55.294 18.446 69 36 69s31.907-13.706 32.94-31h3.005C70.908 56.952 55.211 72 36 72S1.092 56.952.055 38zm0-4C1.092 15.048 16.789 0 36 0s34.908 15.048 35.945 34H68.94C67.907 16.706 53.554 3 36 3S4.093 16.706 3.06 34H.055z"></path></svg>
</div>
<div class="btn_outline static hidden">
<svg width="32" height="32" viewBox="0 0 72 72" fill="#55ff8a"><path d="M36 72C16.118 72 0 55.882 0 36S16.118 0 36 0s36 16.118 36 36-16.118 36-36 36zm0-3c18.225 0 33-14.775 33-33S54.225 3 36 3 3 17.775 3 36s14.775 33 33 33z"></path></svg>
</div>
</button>
</div>
<div class="plan_info_container" style="border-bottom: 1px solid rgba(255, 255, 255, 0.4);">
<input type="text" autocomplete="off" spellcheck="false" placeholder="Enter subscription key" value="" class="settings_text text_input plan_info hiddenleft" data-settings="key" style="padding: 2px 8px 2px 16px">
<div id="edit_key" class="plan_info">
<div class="plan_label clickable key_label">
<div>Subscription Key</div>
<div class="instructions">(Click to enter)</div>
</div>
<div class="plan_value">
<button class="plan_button edit_icon hidden clickable">
<!-- <div class="edit_key"></div> -->
<svg width="16" height="16" fill="#ffffff"><path fill-rule="evenodd" d="M11 0l.217.005a5 5 0 014.778 4.772L16 5 5 16H0v-5L11 0zM2 11.828V14h2.172l9.737-9.737a3.009 3.009 0 00-1.983-2.125l-.184-.052L2 11.828z"></path></svg>
</button>
</div>
</div>
</div>
<br>
<div class="plan_info_box">
<div id="ipbanned_warning" class="warning_box hidden">
<div>Your IP is ineligible for free credits.</div>
<div><a href="https://nopecha.com/pricing" target="_blank">Purchase a key</a> to use with VPN/proxy.</div>
</div>
<div class="plan_info_container">
<div class="plan_info">
<div id="plan" class="plan_label">Free Plan</div>
<div class="plan_value">
<a href="https://nopecha.com/manage" target="_blank">
<button class="plan_button link">
<div class="plan_button_label clickable">Upgrade</div>
</button>
</a>
</div>
</div>
</div>
<div class="plan_info_container">
<div class="plan_info">
<div class="plan_label">Credits</div>
<div class="plan_value">
<a href="https://nopecha.com/manage" target="_blank">
<button class="plan_button link">
<div id="credit" class="plan_button_label clickable">0 / 0</div>
</button>
</a>
</div>
</div>
</div>
<div class="plan_info_container">
<div class="plan_info">
<div class="plan_label">Refills</div>
<div class="plan_value">
<a href="https://nopecha.com/manage" target="_blank">
<button class="plan_button link">
<div id="refills" class="plan_button_label clickable">00:00:00</div>
</button>
</a>
</div>
</div>
</div>
</div>
<div class="menu">
<button class="menu_item_container" data-tabtarget="hcaptcha">
<div class="button_label_container">
<svg width="16" height="16" viewBox="20 20 560 560"><path d="m374.48 524.29h74.9v74.89h-74.9z" fill="#0074bf" opacity=".502"/><path d="m299.59 524.29h74.89v74.89h-74.89zm-74.89 0h74.89v74.89h-74.89z" fill="#0074bf" opacity=".702"/><path d="m149.8 524.29h74.9v74.89h-74.9z" fill="#0074bf" opacity=".502"/><g fill="#0082bf"><path d="m449.39 449.39h74.9v74.9h-74.9z" opacity=".702"/><path d="m374.48 449.39h74.9v74.9h-74.9z" opacity=".8"/><path d="m299.59 449.39h74.89v74.9h-74.89zm-74.89 0h74.89v74.9h-74.89z"/><path d="m149.8 449.39h74.9v74.9h-74.9z" opacity=".8"/><path d="m74.89 449.39h74.9v74.9h-74.9z" opacity=".702"/></g><g fill="#008fbf"><path d="m524.29 374.48h74.89v74.9h-74.89z" opacity=".502"/><path d="m449.39 374.48h74.9v74.9h-74.9z" opacity=".8"/><path d="m374.48 374.48h74.9v74.9h-74.9zm-74.89 0h74.89v74.9h-74.89zm-74.89 0h74.89v74.9h-74.89z"/><path d="m149.8 374.48h74.9v74.9h-74.9z"/><path d="m74.89 374.48h74.9v74.9h-74.9z" opacity=".8"/><path d="m0 374.48h74.89v74.9h-74.89z" opacity=".502"/></g><path d="m524.29 299.59h74.89v74.89h-74.89z" fill="#009dbf" opacity=".702"/><path d="m449.39 299.59h74.9v74.89h-74.9zm-74.91 0h74.9v74.89h-74.9zm-74.89 0h74.89v74.89h-74.89zm-74.89 0h74.89v74.89h-74.89z" fill="#009dbf"/><path d="m149.8 299.59h74.9v74.89h-74.9zm-74.91 0h74.9v74.89h-74.9z" fill="#009dbf"/><path d="m0 299.59h74.89v74.89h-74.89z" fill="#009dbf" opacity=".702"/><path d="m524.29 224.7h74.89v74.89h-74.89z" fill="#00abbf" opacity=".702"/><path d="m449.39 224.7h74.9v74.89h-74.9zm-74.91 0h74.9v74.89h-74.9zm-74.89 0h74.89v74.89h-74.89zm-74.89 0h74.89v74.89h-74.89z" fill="#00abbf"/><path d="m149.8 224.7h74.9v74.89h-74.9zm-74.91 0h74.9v74.89h-74.9z" fill="#00abbf"/><path d="m0 224.7h74.89v74.89h-74.89z" fill="#00abbf" opacity=".702"/><g fill="#00b9bf"><path d="m524.29 149.8h74.89v74.9h-74.89z" opacity=".502"/><path d="m449.39 149.8h74.9v74.9h-74.9z" opacity=".8"/><path d="m374.48 149.8h74.9v74.9h-74.9zm-74.89 0h74.89v74.9h-74.89zm-74.89 0h74.89v74.9h-74.89z"/><path d="m149.8 149.8h74.9v74.9h-74.9z"/><path d="m74.89 149.8h74.9v74.9h-74.9z" opacity=".8"/><path d="m0 149.8h74.89v74.9h-74.89z" opacity=".502"/></g><g fill="#00c6bf"><path d="m449.39 74.89h74.9v74.9h-74.9z" opacity=".702"/><path d="m374.48 74.89h74.9v74.9h-74.9z" opacity=".8"/><path d="m299.59 74.89h74.89v74.9h-74.89zm-74.89 0h74.89v74.9h-74.89z"/><path d="m149.8 74.89h74.9v74.9h-74.9z" opacity=".8"/><path d="m74.89 74.89h74.9v74.9h-74.9z" opacity=".702"/></g><path d="m374.48 0h74.9v74.89h-74.9z" fill="#00d4bf" opacity=".502"/><path d="m299.59 0h74.89v74.89h-74.89zm-74.89 0h74.89v74.89h-74.89z" fill="#00d4bf" opacity=".702"/><path d="m149.8 0h74.9v74.89h-74.9z" fill="#00d4bf" opacity=".502"/><path d="m197.2 275.96 20.87-46.71c7.61-11.97 6.6-26.64-1.72-34.96-.28-.28-.56-.55-.86-.81-.29-.26-.59-.52-.89-.76a21.043 21.043 0 0 0 -1.92-1.37 22.68 22.68 0 0 0 -4.51-2.13c-1.58-.55-3.21-.92-4.87-1.12-1.66-.19-3.34-.2-5-.03s-3.3.51-4.88 1.04c-1.79.55-3.53 1.27-5.19 2.13a32.32 32.32 0 0 0 -4.72 3.02 32.38 32.38 0 0 0 -4.12 3.82 32 32 0 0 0 -3.37 4.48c-.98 1.59-28.57 66.66-39.2 96.62s-6.39 84.91 34.61 125.99c43.48 43.48 106.43 53.41 146.58 23.28.42-.21.84-.44 1.24-.67.41-.23.81-.48 1.2-.74.4-.25.78-.52 1.16-.8.38-.27.75-.56 1.11-.86l123.73-103.32c6.01-4.97 14.9-15.2 6.92-26.88-7.79-11.39-22.55-3.64-28.57.21l-71.21 51.78c-.33.27-.72.48-1.13.6-.42.12-.85.16-1.28.11s-.85-.19-1.22-.4c-.38-.21-.71-.5-.97-.85-1.81-2.22-2.13-8.11.71-10.44l109.16-92.64c9.43-8.49 10.74-20.84 3.1-29.3-7.45-8.29-19.29-8.04-28.8.53l-98.28 
76.83c-.46.38-.99.66-1.56.82s-1.17.21-1.76.13-1.15-.27-1.66-.58c-.51-.3-.96-.7-1.3-1.18-1.94-2.18-2.69-5.89-.5-8.07l111.3-108.01c2.09-1.95 3.78-4.29 4.96-6.88 1.18-2.6 1.85-5.41 1.95-8.26s-.36-5.7-1.36-8.37c-1-2.68-2.51-5.13-4.45-7.22-.97-1.03-2.05-1.95-3.2-2.75a21.14 21.14 0 0 0 -3.69-2.05c-1.3-.55-2.65-.97-4.03-1.26-1.38-.28-2.79-.42-4.2-.41-1.44-.02-2.88.1-4.29.37a21.906 21.906 0 0 0 -7.96 3.16c-1.21.78-2.34 1.68-3.38 2.68l-113.73 106.83c-2.72 2.72-8.04 0-8.69-3.18-.06-.28-.08-.57-.07-.86s.06-.58.15-.85c.08-.28.2-.55.35-.79.15-.25.33-.48.54-.68l87.05-99.12a21.38 21.38 0 0 0 6.82-15.3c.11-5.81-2.15-11.42-6.25-15.53-4.11-4.12-9.71-6.4-15.52-6.31s-11.34 2.53-15.32 6.77l-132.01 145.95c-4.73 4.73-11.7 4.97-15.02 2.22-.51-.4-.93-.9-1.24-1.46-.32-.56-.52-1.18-.6-1.82-.08-.65-.03-1.3.14-1.92s.46-1.21.85-1.72z" fill="#fff"/></svg>
<div class="button_label">hCaptcha</div>
</div>
<div class="icon-container">
<svg viewBox="0 0 16 16" class="menu_item_arrow"><path fill-rule="evenodd" d="M10.3 8l-6-6.3a1 1 0 010-1.4 1 1 0 011.3 0L13 8l-7.4 7.7a1 1 0 01-1.3 0 1 1 0 010-1.4l6-6.3z"></path></svg>
</div>
</button>
<button class="menu_item_container" data-tabtarget="recaptcha">
<div class="button_label_container">
<svg width="16" height="16" viewBox="0 0 70 70"><path d="M64 31.955l-.033-1.37V4.687l-7.16 7.16C50.948 4.674 42.033.093 32.05.093c-10.4 0-19.622 4.96-25.458 12.64l11.736 11.86a15.55 15.55 0 0 1 4.754-5.334c2.05-1.6 4.952-2.906 8.968-2.906.485 0 .86.057 1.135.163 4.976.393 9.288 3.14 11.828 7.124l-8.307 8.307L64 31.953" fill="#1c3aa9"/><path d="M31.862.094l-1.37.033H4.594l7.16 7.16C4.58 13.147 0 22.06 0 32.046c0 10.4 4.96 19.622 12.64 25.458L24.5 45.768a15.55 15.55 0 0 1-5.334-4.754c-1.6-2.05-2.906-4.952-2.906-8.968 0-.485.057-.86.163-1.135.393-4.976 3.14-9.288 7.124-11.828l8.307 8.307L31.86.095" fill="#4285f4"/><path d="M.001 32.045l.033 1.37v25.898l7.16-7.16c5.86 7.173 14.774 11.754 24.76 11.754 10.4 0 19.622-4.96 25.458-12.64l-11.736-11.86a15.55 15.55 0 0 1-4.754 5.334c-2.05 1.6-4.952 2.906-8.968 2.906-.485 0-.86-.057-1.135-.163-4.976-.393-9.288-3.14-11.828-7.124l8.307-8.307c-10.522.04-22.4.066-27.295-.005" fill="#ababab"/></svg>
<div class="button_label">reCAPTCHA</div>
</div>
<div class="icon-container">
<svg viewBox="0 0 16 16" class="menu_item_arrow"><path fill-rule="evenodd" d="M10.3 8l-6-6.3a1 1 0 010-1.4 1 1 0 011.3 0L13 8l-7.4 7.7a1 1 0 01-1.3 0 1 1 0 010-1.4l6-6.3z"></path></svg>
</div>
</button>
<button class="menu_item_container" data-tabtarget="funcaptcha">
<div class="button_label_container">
<svg width="16" height="16" viewBox="18 30 37 34"><path d="M52.107,37.991,38.249,30a3.992,3.992,0,0,0-1.919-.533A3.606,3.606,0,0,0,34.412,30L20.555,37.991a3.829,3.829,0,0,0-1.919,3.3V57.338a3.9,3.9,0,0,0,1.919,3.3l.959.533,4.423,2.558V56.326l10.393-5.969,10.393,5.969v7.355l4.423-2.558.959-.586a3.829,3.829,0,0,0,1.919-3.3V41.243A3.857,3.857,0,0,0,52.107,37.991ZM46.617,47.9,38.2,43a3.99,3.99,0,0,0-1.918-.533A3.607,3.607,0,0,0,34.359,43l-8.474,4.9V43.268l8.688-5.01a3.425,3.425,0,0,1,3.358,0l8.688,5.01Z" fill="#50b95d"/></svg>
<div class="button_label">FunCAPTCHA</div>
</div>
<div class="icon-container">
<svg viewBox="0 0 16 16" class="menu_item_arrow"><path fill-rule="evenodd" d="M10.3 8l-6-6.3a1 1 0 010-1.4 1 1 0 011.3 0L13 8l-7.4 7.7a1 1 0 01-1.3 0 1 1 0 010-1.4l6-6.3z"></path></svg>
</div>
</button>
<button class="menu_item_container" data-tabtarget="awscaptcha">
<div class="button_label_container">
<svg width="16" height="16" viewBox="0 0 256 310"><path d="M0 173.367l.985.52 15.49 1.762 19.455-1.082.856-.45-17.267-1.501-19.519.75z" fill="#B6C99C"/><path d="M128 .698L73.948 27.724V201.23L128 211.148l1.85-2.5V5.148L128 .699z" fill="#4C612C"/><path d="M128 .698v217.7l54.053-16.141V27.724L128 .698z" fill="#769B3F"/><path d="M219.214 174.117l.922.623 19.339 1.074 15.656-1.779.869-.669-19.52-.75-17.266 1.501z" fill="#B6C99C"/><path d="M219.214 210.153l20.27 2.627.543-.998v-35.397l-.543-1.141-20.27-1.126v36.035z" fill="#4C612C"/><path d="M36.786 210.153l-20.27 2.627-.342-.925v-36.001l.342-.61 20.27-1.126v36.035z" fill="#769B3F"/><path d="M125.748 208.651l-89.713-15.765-19.52 1.876.889.891 85.223 17.265.974-.513 22.147-3.754z" fill="#B6C99C"/><path d="M0 191.385v54.428L89.713 290.8v.055L128 310l1.6-3.002v-118.85l-1.6-3.746-38.287-3.753v28.888l-73.197-14.81v-19.483L0 173.367v18.018z" fill="#4C612C"/><path d="M128 209.026l21.771 3.754 2.804.118 85.285-17.129 1.624-1.007-19.144-1.877L128 209.026z" fill="#B6C99C"/><path d="M239.484 175.243v19.483l-73.196 14.811v-30.165L128 183.126V310l128-64.188v-72.446l-16.516 1.877z" fill="#769B3F"/><path d="M166.287 182.375L128 179.372l-38.288 3.003L128 186.13l38.287-3.754z" fill="#B6C99C"/></svg>
<div class="button_label">AWS CAPTCHA</div>
</div>
<div class="icon-container">
<svg viewBox="0 0 16 16" class="menu_item_arrow"><path fill-rule="evenodd" d="M10.3 8l-6-6.3a1 1 0 010-1.4 1 1 0 011.3 0L13 8l-7.4 7.7a1 1 0 01-1.3 0 1 1 0 010-1.4l6-6.3z"></path></svg>
</div>
</button>
<button class="menu_item_container" data-tabtarget="textcaptcha">
<div class="button_label_container">
<svg width="16" height="16" viewBox="0 0 100 100"><g transform="translate(0,51) scale(0.01,-0.01)"><path fill="#ffffff" d="M7287.1,5007c-697.7-46.8-1170-242.5-1467.8-608.4c-227.6-285.1-274.4-463.7-170.2-655.2c89.3-159.5,185.1-202.1,574.4-259.5c234-34,310.6-14.9,425.4,114.9c331.8,370.1,385,399.9,757.3,414.8c393.6,19.1,689.2-59.6,804.1-212.7c72.3-95.7,119.1-274.4,119.1-451v-151l-174.5-55.3c-251-80.8-461.6-131.9-989.2-240.4c-908.3-185.1-1240.2-321.2-1516.7-619c-231.9-251-340.4-551-340.4-933.9C5308.8,661,5713,148.3,6404.3-34.6c119.1-31.9,223.4-40.4,521.2-38.3c342.5,0,387.2,6.4,559.5,57.4c310.6,95.7,604.1,261.7,838.1,470.1l110.6,100l44.7-144.7c59.6-193.6,125.5-289.3,238.3-344.6c80.8-38.3,131.9-44.7,363.7-44.7c363.8,0,472.3,46.8,572.2,242.5c48.9,95.7,44.7,202.1-17,502c-34.1,161.7-38.3,321.2-40.4,1531.6c-2.1,1159.4-6.4,1378.5-38.3,1552.9c-57.4,338.2-112.7,459.5-287.2,642.4c-259.5,272.3-597.8,423.3-1089.2,482.9C7974.2,4998.5,7459.4,5017.6,7287.1,5007z M8325.2,1962.9c-10.6-346.7-17-416.9-59.6-523.3c-97.9-255.3-353.1-482.9-642.4-570.1c-202.1-61.7-489.3-51.1-644.5,23.4c-134,65.9-263.8,195.7-329.7,329.7c-76.6,159.5-74.5,389.3,6.4,525.4c117,200,302.1,282.9,950.9,429.7c240.4,55.3,493.5,117,563.7,138.3c70.2,23.4,136.2,42.5,146.8,42.5C8327.4,2360.7,8329.5,2186.3,8325.2,1962.9z"/><path fill="#ffffff" d="M2909.2,1996.9c-38.3-12.8-104.2-57.4-148.9-100c-72.3-72.3-187.2-359.5-1248.7-3097.3C869.2-2861.7,333.1-4252.9,322.5-4295.5c-40.4-161.7,68.1-378.7,229.7-451c74.5-34,140.4-40.4,448.9-40.4c410.6,0,491.4,21.3,591.4,153.2c34,44.7,148.9,342.5,299.9,772.2l242.5,702h1346.6h1348.7l217-585c119.1-321.2,236.1-640.3,261.6-706.2c55.3-153.2,131.9-244.6,244.7-295.7c117-53.2,776.4-59.6,899.8-8.5c157.4,65.9,259.5,217,259.5,380.8c0,76.6-244.6,708.4-1216.8,3144.1c-1327.4,3331.3-1257.2,3171.7-1452.9,3227C3938.8,2024.6,3009.2,2024.6,2909.2,1996.9z M3945.2-851.5l444.6-1201.9l-906.2-6.4c-497.8-2.1-908.3,0-912.6,4.3c-6.4,6.4,782.8,2216.6,878.6,2459.1C3466.6,446.2,3441.1,514.2,3945.2-851.5z"/></g></svg>
<div class="button_label">Text CAPTCHA</div>
</div>
<div class="icon-container">
<svg viewBox="0 0 16 16" class="menu_item_arrow"><path fill-rule="evenodd" d="M10.3 8l-6-6.3a1 1 0 010-1.4 1 1 0 011.3 0L13 8l-7.4 7.7a1 1 0 01-1.3 0 1 1 0 010-1.4l6-6.3z"></path></svg>
</div>
</button>
</div>
<div class='footer'>
<!-- <div>Join us on <a href="https://nopecha.com/discord">Discord</a> for more credits!</div>
<div id="export" class="hidden">Export Settings</div> -->
</div>
</div>
<!-- SETTINGS TAB -->
<div class="tab hidden" data-tab="settings">
<div class="header">
<button class="nav_icon back" data-tabtarget="main">
<svg width="12" height="12" viewBox="0 0 9 16"><path d="M2.684 8l6.038-6.308c.37-.387.37-1.015 0-1.402a.92.92 0 00-1.342 0L0 8l7.38 7.71a.92.92 0 001.342 0c.37-.387.37-1.015 0-1.402L2.684 8z"></path></svg>
</button>
<div class="header_label_container">
<div class="header_label">Settings</div>
</div>
<button class="nav_icon" disabled></button>
</div>
<div class="menu">
<button class="menu_item_container" data-tabtarget="disabled_hosts">
<div class="button_label_container">
<svg width="16" height="16"><path fill-rule="evenodd" d="M3 2v12h10V2H3zm0-2h10a2 2 0 012 2v12a2 2 0 01-2 2H3a2 2 0 01-2-2V2a2 2 0 012-2zm2 4h6v2H5V4zm0 4h3v2H5V8z"></path></svg>
<div class="button_label">Disabled Hosts</div>
</div>
<div class="icon-container">
<svg viewBox="0 0 16 16" class="menu_item_arrow"><path fill-rule="evenodd" d="M10.3 8l-6-6.3a1 1 0 010-1.4 1 1 0 011.3 0L13 8l-7.4 7.7a1 1 0 01-1.3 0 1 1 0 010-1.4l6-6.3z"></path></svg>
</div>
</button>
<button id="export" class="menu_item_container hidden">
<div class="button_label_container">
<svg width="16" height="16" fill="#ffffff"><path d="M8 0A8 8 0 11.582 11.001h2.221a6 6 0 100-6L.582 4.999A8.003 8.003 0 018 0zM7 5l4 3-4 3-.001-2H0V7h6.999L7 5z"></path></svg>
<div class="button_label">Export Settings</div>
</div>
<div class="icon-container">
<svg viewBox="0 0 16 16" class="menu_item_arrow"><path fill-rule="evenodd" d="M10.3 8l-6-6.3a1 1 0 010-1.4 1 1 0 011.3 0L13 8l-7.4 7.7a1 1 0 01-1.3 0 1 1 0 010-1.4l6-6.3z"></path></svg>
</div>
</button>
</div>
<!-- LINKS -->
<div style="position: relative; overflow: hidden; width: 100%; height: auto; min-height: 0px; max-height: 402px;">
<div style="position: relative; overflow: auto; margin-bottom: -15px; min-height: 15px; max-height: 417px;">
<div class="scrolling_container">
<div style="margin-top: 16px; margin-bottom: 8px; padding-bottom: 4px; font-size: 14px; font-weight: bold; border-bottom: 1px solid rgba(255, 255, 255, 0.5);">
Links
</div>
<div class="settings_item_container">
<a href="https://developers.nopecha.com" target="_blank">
<div class="settings_description_container bbflex">
<svg width="16" height="16" fill="#ffffff"><path d="M14 2.894a3.898 3.898 0 00-4.808-.126L8 3.653l-1.192-.885A3.898 3.898 0 002 2.894v9.716a7.676 7.676 0 016.006.864A7.705 7.705 0 0114 12.621V2.894zm2 10.584v1.687l-.66.275a5.652 5.652 0 00-1.34-.72V12.62c.695.187 1.37.472 2 .857zM0 2.027l.403-.387A5.898 5.898 0 018 1.162a5.898 5.898 0 017.597.478l.403.387V16a5.692 5.692 0 00-8 0 5.663 5.663 0 00-8 0V2.027zm7-.019h2v12H7v-12z"></path></svg>
<div class="settings_description">Documentation</div>
</div>
</a>
</div>
<div class="settings_item_container">
<a href="https://nopecha.com" target="_blank">
<div class="settings_description_container bbflex">
<svg width="16" height="16" viewBox="2 2 22 22" fill="#ffffff"><path d="M 12 2.0996094 L 1 12 L 4 12 L 4 21 L 11 21 L 11 15 L 13 15 L 13 21 L 20 21 L 20 12 L 23 12 L 12 2.0996094 z M 12 4.7910156 L 18 10.191406 L 18 11 L 18 19 L 15 19 L 15 13 L 9 13 L 9 19 L 6 19 L 6 10.191406 L 12 4.7910156 z"/></svg>
<div class="settings_description">Homepage</div>
</div>
</a>
</div>
<div class="settings_item_container">
<a href="https://nopecha.com/discord" target="_blank">
<div class="settings_description_container bbflex">
<svg width="16" height="16" viewBox="0 0 71 55" fill="#ffffff"><path d="M60.1045 4.8978C55.5792 2.8214 50.7265 1.2916 45.6527 0.41542C45.5603 0.39851 45.468 0.440769 45.4204 0.525289C44.7963 1.6353 44.105 3.0834 43.6209 4.2216C38.1637 3.4046 32.7345 3.4046 27.3892 4.2216C26.905 3.0581 26.1886 1.6353 25.5617 0.525289C25.5141 0.443589 25.4218 0.40133 25.3294 0.41542C20.2584 1.2888 15.4057 2.8186 10.8776 4.8978C10.8384 4.9147 10.8048 4.9429 10.7825 4.9795C1.57795 18.7309 -0.943561 32.1443 0.293408 45.3914C0.299005 45.4562 0.335386 45.5182 0.385761 45.5576C6.45866 50.0174 12.3413 52.7249 18.1147 54.5195C18.2071 54.5477 18.305 54.5139 18.3638 54.4378C19.7295 52.5728 20.9469 50.6063 21.9907 48.5383C22.0523 48.4172 21.9935 48.2735 21.8676 48.2256C19.9366 47.4931 18.0979 46.6 16.3292 45.5858C16.1893 45.5041 16.1781 45.304 16.3068 45.2082C16.679 44.9293 17.0513 44.6391 17.4067 44.3461C17.471 44.2926 17.5606 44.2813 17.6362 44.3151C29.2558 49.6202 41.8354 49.6202 53.3179 44.3151C53.3935 44.2785 53.4831 44.2898 53.5502 44.3433C53.9057 44.6363 54.2779 44.9293 54.6529 45.2082C54.7816 45.304 54.7732 45.5041 54.6333 45.5858C52.8646 46.6197 51.0259 47.4931 49.0921 48.2228C48.9662 48.2707 48.9102 48.4172 48.9718 48.5383C50.038 50.6034 51.2554 52.5699 52.5959 54.435C52.6519 54.5139 52.7526 54.5477 52.845 54.5195C58.6464 52.7249 64.529 50.0174 70.6019 45.5576C70.6551 45.5182 70.6887 45.459 70.6943 45.3942C72.1747 30.0791 68.2147 16.7757 60.1968 4.9823C60.1772 4.9429 60.1437 4.9147 60.1045 4.8978ZM23.7259 37.3253C20.2276 37.3253 17.3451 34.1136 17.3451 30.1693C17.3451 26.225 20.1717 23.0133 23.7259 23.0133C27.308 23.0133 30.1626 26.2532 30.1066 30.1693C30.1066 34.1136 27.28 37.3253 23.7259 37.3253ZM47.3178 37.3253C43.8196 37.3253 40.9371 34.1136 40.9371 30.1693C40.9371 26.225 43.7636 23.0133 47.3178 23.0133C50.9 23.0133 53.7545 26.2532 53.6986 30.1693C53.6986 34.1136 50.9 37.3253 47.3178 37.3253Z"/></svg>
<div class="settings_description">Discord</div>
</div>
</a>
</div>
<div class="settings_item_container">
<a href="https://nopecha.com/github" target="_blank">
<div class="settings_description_container bbflex">
<svg width="16" height="16" viewBox="0 0 24 24" fill="#ffffff"><path d="M12 0c-6.626 0-12 5.373-12 12 0 5.302 3.438 9.8 8.207 11.387.599.111.793-.261.793-.577v-2.234c-3.338.726-4.033-1.416-4.033-1.416-.546-1.387-1.333-1.756-1.333-1.756-1.089-.745.083-.729.083-.729 1.205.084 1.839 1.237 1.839 1.237 1.07 1.834 2.807 1.304 3.492.997.107-.775.418-1.305.762-1.604-2.665-.305-5.467-1.334-5.467-5.931 0-1.311.469-2.381 1.236-3.221-.124-.303-.535-1.524.117-3.176 0 0 1.008-.322 3.301 1.23.957-.266 1.983-.399 3.003-.404 1.02.005 2.047.138 3.006.404 2.291-1.552 3.297-1.23 3.297-1.23.653 1.653.242 2.874.118 3.176.77.84 1.235 1.911 1.235 3.221 0 4.609-2.807 5.624-5.479 5.921.43.372.823 1.102.823 2.222v3.293c0 .319.192.694.801.576 4.765-1.589 8.199-6.086 8.199-11.386 0-6.627-5.373-12-12-12z"/></svg>
<div class="settings_description">GitHub</div>
</div>
</a>
</div>
</div>
</div>
</div>
<div class='footer'></div>
</div>
<!-- DISABLED HOSTS TAB -->
<div class="tab hidden" data-tab="disabled_hosts">
<div class="header">
<button class="nav_icon back" data-tabtarget="settings">
<svg width="12" height="12" viewBox="0 0 9 16"><path d="M2.684 8l6.038-6.308c.37-.387.37-1.015 0-1.402a.92.92 0 00-1.342 0L0 8l7.38 7.71a.92.92 0 001.342 0c.37-.387.37-1.015 0-1.402L2.684 8z"></path></svg>
</button>
<div class="header_label_container">
<div class="header_label">Disabled Hosts</div>
</div>
<button class="nav_icon" disabled></button>
</div>
<div style="position: relative; overflow: hidden; width: 100%; height: auto; min-height: 0px; max-height: 402px;">
<div style="position: relative; overflow: auto; margin-bottom: -15px; min-height: 15px; max-height: 417px;">
<div class="scrolling_container">
<div class="css-rghnfo">
<div class="settings_item_header">Current Page</div>
<div class="settings_item_container list_item">
<div class="list_item_row">
<div id="current_page_host">-</div>
<button id="add_current_page_host" class="list_item_button">
<svg width="16" height="16"><path fill="rgb(0, 106, 255)" fill-rule="evenodd" d="M9 7h6a1 1 0 110 2H9v6a1 1 0 11-2 0V9H1a1 1 0 110-2h6V1a1 1 0 112 0v6z"></path></svg>
</button>
</div>
</div>
</div>
<div>
<div class="settings_item_header">Disabled Hosts</div>
<div id="disabled_hosts"></div>
</div>
</div>
</div>
</div>
</div>
</div>
<script src="utils.js"></script>
<script src="content.js"></script>
<script src="popup.js"></script>
</body>
</html>

File diff suppressed because one or more lines are too long

View file

@@ -1 +0,0 @@
(async()=>{function e(){var e="true"===document.querySelector(".recaptcha-checkbox")?.getAttribute("aria-checked"),t=document.querySelector("#recaptcha-verify-button")?.disabled;return e||t}function d(r=15e3){return new Promise(async e=>{for(var t=Time.time();;){var a=document.querySelectorAll(".rc-imageselect-tile"),c=document.querySelectorAll(".rc-imageselect-dynamic-selected");if(0<a.length&&0===c.length)return e(!0);if(Time.time()-t>r)return e(!1);await Time.sleep(100)}})}let p=null;function a(e=500){return new Promise(m=>{let h=!1;const f=setInterval(async()=>{if(!h){h=!0;var c=document.querySelector(".rc-imageselect-instructions")?.innerText?.split("\n"),r=await async function(e){let t=null;return(t=1<e.length?(t=e.slice(0,2).join(" ")).replace(/\s+/g," ")?.trim():t.join("\n"))||null}(c);if(r){var c=3===c.length,i=document.querySelectorAll("table tr td");if(9===i.length||16===i.length){var l=[],n=Array(i.length).fill(null);let e=null,t=!1,a=0;for(const u of i){var o=u?.querySelector("img");if(!o)return void(h=!1);var s=o?.src?.trim();if(!s||""===s)return void(h=!1);300<=o.naturalWidth?e=s:100==o.naturalWidth&&(n[a]=s,t=!0),l.push(u),a++}t&&(e=null);i=JSON.stringify([e,n]);if(p!==i)return p=i,clearInterval(f),h=!1,m({task:r,is_hard:c,cells:l,background_url:e,urls:n})}}h=!1}},e)})}async function t(){!0===await BG.exec("Cache.get",{name:"recaptcha_widget_visible",tab_specific:!0})&&(e()?r=r||!0:(r=!1,await Time.sleep(500),document.querySelector("#recaptcha-anchor")?.click()))}async function c(){var c=await BG.exec("Cache.get",{name:"recaptcha_image_visible",tab_specific:!0});if(!0===c&&(null===document.querySelector(".rc-doscaptcha-header")&&!e()))if(g=!(g||!function(){for(const e of[".rc-imageselect-incorrect-response"])if(""===document.querySelector(e)?.style.display)return 1}()||(y=[],0)),function(){for(const t of[".rc-imageselect-error-select-more",".rc-imageselect-error-dynamic-more",".rc-imageselect-error-select-something"]){var e=document.querySelector(t);if(""===e?.style.display||0===e?.tabIndex)return 1}}())y=[];else if(await d()){var{task:c,is_hard:r,cells:t,background_url:i,urls:l}=await a(),n=await BG.exec("Settings.get");if(n&&n.enabled&&n.recaptcha_auto_solve){var o=9==t.length?3:4,s=[];let e,a=[];if(null===i){e="1x1";for(let e=0;e<l.length;e++){var u=l[e],m=t[e];u&&!y.includes(u)&&(s.push(u),a.push(m))}}else s.push(i),e=o+"x"+o,a=t;var i=Time.time(),h=(await NopeCHA.post({captcha_type:IS_DEVELOPMENT?"recaptcha_dev":"recaptcha",task:c,image_urls:s,grid:e,key:n.key}))["data"];if(h){c=parseInt(n.recaptcha_solve_delay_time)||1e3,n=n.recaptcha_solve_delay?c-(Time.time()-i):0;0<n&&await Time.sleep(n);let t=0;for(let e=0;e<h.length;e++)!1!==h[e]&&(t++,function(e){try{return e.classList.contains("rc-imageselect-tileselected")}catch{}}(a[e])||(a[e]?.click(),await Time.sleep(100*Math.random()+200)));for(const f of l)y.push(f),9<y.length&&y.shift();(3==o&&r&&0===t&&await d()||3==o&&!r||4==o)&&(await Time.sleep(200),document.querySelector("#recaptcha-verify-button")?.click())}}}}let r=!1,g=!1,y=[];for(;;){await Time.sleep(1e3);var i,l=await BG.exec("Settings.get");l&&l.enabled&&"Image"===l.recaptcha_solve_method&&(i=await Location.hostname(),l.disabled_hosts.includes(i)||(await async function(){var e=[...document.querySelectorAll('iframe[src*="/recaptcha/api2/bframe"]'),...document.querySelectorAll('iframe[src*="/recaptcha/enterprise/bframe"]')];if(0<e.length){for(const t of e)if("visible"===window.getComputedStyle(t).visibility)return 
BG.exec("Cache.set",{name:"recaptcha_image_visible",value:!0,tab_specific:!0});await BG.exec("Cache.set",{name:"recaptcha_image_visible",value:!1,tab_specific:!0})}}(),await async function(){var e=[...document.querySelectorAll('iframe[src*="/recaptcha/api2/anchor"]'),...document.querySelectorAll('iframe[src*="/recaptcha/enterprise/anchor"]')];if(0<e.length){for(const t of e)if("visible"===window.getComputedStyle(t).visibility)return BG.exec("Cache.set",{name:"recaptcha_widget_visible",value:!0,tab_specific:!0});await BG.exec("Cache.set",{name:"recaptcha_widget_visible",value:!1,tab_specific:!0})}}(),l.recaptcha_auto_open&&null!==document.querySelector(".recaptcha-checkbox")?await t():l.recaptcha_auto_solve&&null!==document.querySelector("#rc-imageselect")&&await c()))}})();
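For readability, the request that this minified reCAPTCHA solver assembles can be sketched as follows. This is a reconstruction from the script above, not an official API reference; the task text and image URL are hypothetical placeholders, and BG / NopeCHA are the extension's own helpers assumed to be in scope of an async content-script context:

// Sketch only: field names mirror the object literal passed to NopeCHA.post in the script above.
const settings = await BG.exec('Settings.get');
const { data } = await NopeCHA.post({
    captcha_type: 'recaptcha',                        // 'recaptcha_dev' when IS_DEVELOPMENT is true
    task: 'Select all images with a bus',             // hypothetical challenge instruction text
    image_urls: ['https://example.com/payload.jpg'],  // hypothetical: one sprite URL for 3x3/4x4 grids, one URL per tile for 1x1
    grid: '3x3',                                      // '1x1', '3x3' or '4x4' depending on the challenge layout
    key: settings.key,
});
// `data` is treated as an array with one entry per tile; every entry that is not
// strictly false marks a tile the script clicks before pressing the verify button.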

View file

@@ -1 +0,0 @@
(async()=>{let i=null,t=!1,n=!1;function s(e){let t=e;for(;t&&!t.classList?.contains("rc-imageselect-tile");)t=t.parentNode;return t}function a(e,t,n=!1){!e||!n&&i===e||(!0===t&&e.classList.contains("rc-imageselect-tileselected")||!1===t&&!e.classList.contains("rc-imageselect-tileselected"))&&e.click()}document.addEventListener("mousedown",e=>{e=s(e?.target);e&&(n=e.classList.contains("rc-imageselect-tileselected")?t=!0:!(t=!0),i=e)}),document.addEventListener("mouseup",e=>{t=!1,i=null}),document.addEventListener("mousemove",e=>{e=s(e?.target);t&&(i!==e&&null!==i&&a(i,n,!0),a(e,n))});window.addEventListener("load",()=>{var e=document.body.appendChild(document.createElement("style")).sheet;e.insertRule(".rc-imageselect-table-33, .rc-imageselect-table-42, .rc-imageselect-table-44 {transition-duration: 0.5s !important}",0),e.insertRule(".rc-imageselect-tile {transition-duration: 2s !important}",1),e.insertRule(".rc-imageselect-dynamic-selected {transition-duration: 1s !important}",2),e.insertRule(".rc-imageselect-progress {transition-duration: 0.5s !important}",3),e.insertRule(".rc-image-tile-overlay {transition-duration: 0.5s !important}",4),e.insertRule("#rc-imageselect img {pointer-events: none !important}",5)})})();

View file

@@ -1 +0,0 @@
(async()=>{function e(){var e,t;if(!i())return e="true"===document.querySelector(".recaptcha-checkbox")?.getAttribute("aria-checked"),t=document.querySelector("#recaptcha-verify-button")?.disabled,e||t}function i(){return"Try again later"===document.querySelector(".rc-doscaptcha-header")?.innerText}async function t(){!0!==await BG.exec("Cache.get",{name:"recaptcha_widget_visible",tab_specific:!0})||e()||(await Time.sleep(500),document.querySelector("#recaptcha-anchor")?.click())}async function a(t){var a=await BG.exec("Cache.get",{name:"recaptcha_image_visible",tab_specific:!0});if(!0===a&&!e()&&!i()){a=document.querySelector(".rc-audiochallenge-tdownload-link")?.href,a=(fetch(a),document.querySelector("#audio-source")?.src?.replace("recaptcha.net","google.com"));let e=document.querySelector("html")?.getAttribute("lang")?.trim();e&&0!==e.length||(e="en");var c=Time.time(),a=await Net.fetch("https://engageub.pythonanywhere.com",{method:"POST",headers:{"Content-Type":"application/x-www-form-urlencoded"},body:"input="+encodeURIComponent(a)+"&lang="+e});document.querySelector("#audio-response").value=a;a=parseInt(t.recaptcha_solve_delay_time)||1e3,t=t.recaptcha_solve_delay?a-(Time.time()-c):0;0<t&&await Time.sleep(t),document.querySelector("#recaptcha-verify-button")?.click()}}for(;;){await Time.sleep(1e3);var c,r=await BG.exec("Settings.get");r&&r.enabled&&"Speech"===r.recaptcha_solve_method&&(c=await Location.hostname(),r.disabled_hosts.includes(c)||(await async function(){var e=[...document.querySelectorAll('iframe[src*="/recaptcha/api2/bframe"]'),...document.querySelectorAll('iframe[src*="/recaptcha/enterprise/bframe"]')];if(0<e.length){for(const t of e)if("visible"===window.getComputedStyle(t).visibility)return BG.exec("Cache.set",{name:"recaptcha_image_visible",value:!0,tab_specific:!0});await BG.exec("Cache.set",{name:"recaptcha_image_visible",value:!1,tab_specific:!0})}}(),await async function(){var e=[...document.querySelectorAll('iframe[src*="/recaptcha/api2/anchor"]'),...document.querySelectorAll('iframe[src*="/recaptcha/enterprise/anchor"]')];if(0<e.length){for(const t of e)if("visible"===window.getComputedStyle(t).visibility)return BG.exec("Cache.set",{name:"recaptcha_widget_visible",value:!0,tab_specific:!0});await BG.exec("Cache.set",{name:"recaptcha_widget_visible",value:!1,tab_specific:!0})}}(),r.recaptcha_auto_open&&null!==document.querySelector(".recaptcha-checkbox")?await t():r.recaptcha_auto_solve&&null!==document.querySelector(".rc-imageselect-instructions")?await(!0===await BG.exec("Cache.get",{name:"recaptcha_image_visible",tab_specific:!0})&&!e()&&(await Time.sleep(500),!document.querySelector("#recaptcha-audio-button")?.click())):!r.recaptcha_auto_solve||null===document.querySelector("#audio-instructions")&&null===document.querySelector(".rc-doscaptcha-header")||await a(r)))}})();

View file

@@ -1,3 +0,0 @@
(async()=>{function e(){try{function t(t){return`<p style='font-family: monospace; font-size: 12px; white-space: pre;'>${t}</p>`}var e=[];for(const o of arguments)e.push(t(o));e.push(t('Join us on <a href="https://nopecha.com/discord" target="_blank">Discord</a>')),document.body.innerHTML=e.join("<hr>")}catch(t){}}try{var t,o;document.location.hash?(document.title="NopeCHA Setup",e("Importing NopeCHA Settings..."),await BG.exec("Settings.get"),t=SettingsManager.import(document.location.hash),e(`Visiting this URL will import your NopeCHA settings.
<a href="${o=window.location.href}">${o}</a>`,`Successfully imported settings.
`+JSON.stringify(t,null,4))):e("Invalid URL.\nPlease set the URL hash and reload the page.","Example: https://nopecha.com/setup#TESTKEY123")}catch(t){e("Failed to import settings.\nPlease verify that your URL is formed properly.")}})();

View file

@@ -1 +0,0 @@
(async()=>{async function r(t){function c(a){return new Promise(t=>{const e=new Image;e.onload=()=>t(e),e.src=function(t){let e=t.style.backgroundImage;return e&&((t=e.trim().match(/(?!^)".*?"/g))&&0!==t.length||(e=null),e=t[0].replaceAll('"',"")),e}(a)})}try{return(await async function(t){var e=document.querySelector(t);if(e instanceof HTMLCanvasElement)return e;let a;if(a=e instanceof HTMLImageElement?e:await c(e))return(e=document.createElement("canvas")).width=a.naturalWidth,e.height=a.naturalHeight,e.getContext("2d").drawImage(a,0,0),e;throw Error("failed to get image element for "+t)}(t)).toDataURL("image/jpeg").split(";base64,")[1]}catch(t){return null}}let l=null;async function t(){var t,e,a,c,n=(t=500,await new Promise(e=>{let a=!1;const c=setInterval(async()=>{if(!a){a=!0;var t=await BG.exec("Settings.get");if(t&&t.textcaptcha_auto_solve){t=await r(t.textcaptcha_image_selector);if(t&&l!==t)return l=t,clearInterval(c),a=!1,e({image_data:t})}a=!1}},t)}))["image_data"],i=await BG.exec("Settings.get");i&&i.enabled&&i.textcaptcha_auto_solve&&(c=Time.time(),{job_id:e,data:n}=await NopeCHA.post({captcha_type:IS_DEVELOPMENT?"textcaptcha_dev":"textcaptcha",image_data:[n],key:i.key}),n)&&(a=(a=parseInt(i.textcaptcha_solve_delay_time))||100,0<(a=i.textcaptcha_solve_delay?a-(Time.time()-c):0)&&await Time.sleep(a),n)&&0<n.length&&(c=document.querySelector(i.textcaptcha_input_selector))&&!c.value&&(c.value=n[0])}for(;;){await Time.sleep(1e3);var e,a=await BG.exec("Settings.get");a&&a.enabled&&(e=await Location.hostname(),a.disabled_hosts.includes(e)||a.textcaptcha_auto_solve&&function(t){try{var e;if(t?.textcaptcha_image_selector&&t?.textcaptcha_input_selector)return document.querySelector(t.textcaptcha_image_selector)&&!(!(e=document.querySelector(t.textcaptcha_input_selector))||e.value)}catch(t){}}(a)&&await t())}})();

View file

@@ -1,272 +0,0 @@
'use strict';
/**
* Set to true for the following behavior:
* - Request server to recognize using bleeding-edge models
* - Reload FunCAPTCHA on verification
*/
const IS_DEVELOPMENT = false;
// export const BASE_API = 'https://dev-api.nopecha.com';
const BASE_API = 'https://api.nopecha.com';
/**
* Trying to be an Enum but javascript doesn't have enums
*/
class RunningAs {
// Background script running on-demand
static BACKGROUND = 'BACKGROUND';
// Popup specified in manifest as "action"
static POPUP = 'POPUP';
// Content script running in page
static CONTENT = 'CONTENT';
// (somehow) Standalone run of script running in webpage
static WEB = 'WEB';
}
Object.freeze(RunningAs);
const runningAt = (() => {
let getBackgroundPage = globalThis?.chrome?.extension?.getBackgroundPage;
if (getBackgroundPage){
return getBackgroundPage() === window ? RunningAs.BACKGROUND : RunningAs.POPUP;
}
return globalThis?.chrome?.runtime?.onMessage ? RunningAs.CONTENT : RunningAs.WEB;
})();
function deep_copy(obj) {
return JSON.parse(JSON.stringify(obj));
}
class Util {
static CHARS = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
static pad_left(s, char, n) {
while (`${s}`.length < n) {
s = `${char}${s}`;
}
return s;
}
static capitalize(s) {
return s.charAt(0).toUpperCase() + s.slice(1);
}
static parse_int(s, fallback) {
if (!s) {
s = fallback;
}
return parseInt(s);
}
static parse_bool(s, fallback) {
if (s === 'true') {
s = true;
}
else if (s === 'false') {
s = false;
}
else {
s = fallback;
}
return s;
}
static parse_string(s, fallback) {
if (!s) {
s = fallback;
}
return s;
}
static parse_json(s, fallback) {
if (!s) {
s = fallback;
}
else {
s = JSON.parse(s);
}
return s;
}
static generate_id(n) {
let result = '';
for (let i = 0; i < n; i++) {
result += Util.CHARS.charAt(Math.floor(Math.random() * Util.CHARS.length));
}
return result;
}
}
class Time {
static time() {
if (!Date.now) {
Date.now = () => new Date().getTime();
}
return Date.now();
}
static date() {
return new Date();
}
static sleep(i=1000) {
return new Promise(resolve => setTimeout(resolve, i));
}
static async random_sleep(min, max) {
const duration = Math.floor(Math.random() * (max - min) + min);
return await Time.sleep(duration);
}
static seconds_as_hms(t) {
t = Math.max(0, t);
const hours = Util.pad_left(Math.floor(t / 3600), '0', 2);
t %= 3600;
const minutes = Util.pad_left(Math.floor(t / 60), '0', 2);
const seconds = Util.pad_left(Math.floor(t % 60), '0', 2);
return `${hours}:${minutes}:${seconds}`;
}
static string(d=null) {
if (!d) {
d = Time.date();
}
const month = Util.pad_left(d.getMonth() + 1, '0', 2);
const date = Util.pad_left(d.getDate(), '0', 2);
const year = d.getFullYear();
const hours = Util.pad_left(d.getHours() % 12, '0', 2);
const minutes = Util.pad_left(d.getMinutes(), '0', 2);
const seconds = Util.pad_left(d.getSeconds(), '0', 2);
const period = d.getHours() >= 12 ? 'PM' : 'AM';
return `${month}/${date}/${year} ${hours}:${minutes}:${seconds} ${period}`;
}
}
class SettingsManager {
static DEFAULT = {
version: 15,
key: '',
enabled: true,
disabled_hosts: [],
hcaptcha_auto_open: true,
hcaptcha_auto_solve: true,
hcaptcha_solve_delay: true,
hcaptcha_solve_delay_time: 3000,
recaptcha_auto_open: true,
recaptcha_auto_solve: true,
recaptcha_solve_delay: true,
recaptcha_solve_delay_time: 1000,
recaptcha_solve_method: 'Image',
funcaptcha_auto_open: true,
funcaptcha_auto_solve: true,
funcaptcha_solve_delay: true,
funcaptcha_solve_delay_time: 1000,
awscaptcha_auto_open: true,
awscaptcha_auto_solve: true,
awscaptcha_solve_delay: true,
awscaptcha_solve_delay_time: 1000,
textcaptcha_auto_solve: true,
textcaptcha_solve_delay: true,
textcaptcha_solve_delay_time: 100,
textcaptcha_image_selector: '',
textcaptcha_input_selector: '',
};
static ENCODE_FIELDS = {
enabled: {parse: Util.parse_bool, encode: encodeURIComponent},
disabled_hosts: {parse: Util.parse_json, encode: e => encodeURIComponent(JSON.stringify(e))},
hcaptcha_auto_open: {parse: Util.parse_bool, encode: encodeURIComponent},
hcaptcha_auto_solve: {parse: Util.parse_bool, encode: encodeURIComponent},
hcaptcha_solve_delay: {parse: Util.parse_bool, encode: encodeURIComponent},
hcaptcha_solve_delay_time: {parse: Util.parse_int, encode: encodeURIComponent},
recaptcha_auto_open: {parse: Util.parse_bool, encode: encodeURIComponent},
recaptcha_auto_solve: {parse: Util.parse_bool, encode: encodeURIComponent},
recaptcha_solve_delay: {parse: Util.parse_bool, encode: encodeURIComponent},
recaptcha_solve_delay_time: {parse: Util.parse_int, encode: encodeURIComponent},
recaptcha_solve_method: {parse: Util.parse_string, encode: encodeURIComponent},
funcaptcha_auto_open: {parse: Util.parse_bool, encode: encodeURIComponent},
funcaptcha_auto_solve: {parse: Util.parse_bool, encode: encodeURIComponent},
funcaptcha_solve_delay: {parse: Util.parse_bool, encode: encodeURIComponent},
funcaptcha_solve_delay_time: {parse: Util.parse_int, encode: encodeURIComponent},
awscaptcha_auto_open: {parse: Util.parse_bool, encode: encodeURIComponent},
awscaptcha_auto_solve: {parse: Util.parse_bool, encode: encodeURIComponent},
awscaptcha_solve_delay: {parse: Util.parse_bool, encode: encodeURIComponent},
awscaptcha_solve_delay_time: {parse: Util.parse_int, encode: encodeURIComponent},
textcaptcha_auto_solve: {parse: Util.parse_bool, encode: encodeURIComponent},
textcaptcha_solve_delay: {parse: Util.parse_bool, encode: encodeURIComponent},
textcaptcha_solve_delay_time: {parse: Util.parse_int, encode: encodeURIComponent},
textcaptcha_image_selector: {parse: Util.parse_string, encode: encodeURIComponent},
textcaptcha_input_selector: {parse: Util.parse_string, encode: encodeURIComponent},
};
static IMPORT_URL = 'https://nopecha.com/setup';
static DELIMITER = '|';
static export(settings) {
if (!settings.key) {
return false;
}
const fields = [settings.key];
for (const k in SettingsManager.ENCODE_FIELDS) {
fields.push(`${k}=${SettingsManager.ENCODE_FIELDS[k].encode(settings[k])}`);
}
const encoded_hash = `#${fields.join(SettingsManager.DELIMITER)}`;
return `${SettingsManager.IMPORT_URL}${encoded_hash}`;
}
static import(encoded_hash) {
const settings = {};
// Split by delimiter
const fields = encoded_hash.split(SettingsManager.DELIMITER);
if (fields.length === 0) {
return settings;
}
// Parse key
const key = fields.shift();
if (key.length <= 1) {
console.error('invalid key for settings', key);
return settings;
}
settings.key = key.substring(1);
// Parse additional fields
for (const field of fields) {
const kv = field.split('=');
const k = kv.shift();
const v_raw = kv.join('=');
if (!(k in SettingsManager.ENCODE_FIELDS)) {
console.error('invalid field for settings', field);
continue;
}
const v = decodeURIComponent(v_raw);
console.log('v', v);
settings[k] = SettingsManager.ENCODE_FIELDS[k].parse(v, SettingsManager.DEFAULT[k]);
}
return settings;
}
}
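
To illustrate the URL-hash format that SettingsManager.export and SettingsManager.import agree on, here is a minimal round-trip sketch; it is not code from the repository, and the key and host value are made up:

// Export serializes the key followed by each ENCODE_FIELDS entry, joined by '|', into a setup-page URL hash.
const exported = SettingsManager.export({
    ...SettingsManager.DEFAULT,
    key: 'TESTKEY123',                    // hypothetical subscription key
    disabled_hosts: ['example.com'],      // hypothetical disabled host
});
// exported resembles:
// https://nopecha.com/setup#TESTKEY123|enabled=true|disabled_hosts=%5B%22example.com%22%5D|hcaptcha_auto_open=true|...
const restored = SettingsManager.import(new URL(exported).hash);
console.log(restored.key);            // "TESTKEY123"
console.log(restored.disabled_hosts); // ["example.com"]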

View file

@@ -1,272 +0,0 @@
'use strict';
/**
* Set to true for the following behavior:
* - Request server to recognize using bleeding-edge models
* - Reload FunCAPTCHA on verification
*/
export const IS_DEVELOPMENT = false;
// export const BASE_API = 'https://dev-api.nopecha.com';
export const BASE_API = 'https://api.nopecha.com';
/**
* Trying to be an Enum but javascript doesn't have enums
*/
export class RunningAs {
// Background script running on-demand
static BACKGROUND = 'BACKGROUND';
// Popup specified in manifest as "action"
static POPUP = 'POPUP';
// Content script running in page
static CONTENT = 'CONTENT';
// (somehow) Standalone run of script running in webpage
static WEB = 'WEB';
}
Object.freeze(RunningAs);
export const runningAt = (() => {
let getBackgroundPage = globalThis?.chrome?.extension?.getBackgroundPage;
if (getBackgroundPage){
return getBackgroundPage() === window ? RunningAs.BACKGROUND : RunningAs.POPUP;
}
return globalThis?.chrome?.runtime?.onMessage ? RunningAs.CONTENT : RunningAs.WEB;
})();
export function deep_copy(obj) {
return JSON.parse(JSON.stringify(obj));
}
export class Util {
static CHARS = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
static pad_left(s, char, n) {
while (`${s}`.length < n) {
s = `${char}${s}`;
}
return s;
}
static capitalize(s) {
return s.charAt(0).toUpperCase() + s.slice(1);
}
static parse_int(s, fallback) {
if (!s) {
s = fallback;
}
return parseInt(s);
}
static parse_bool(s, fallback) {
if (s === 'true') {
s = true;
}
else if (s === 'false') {
s = false;
}
else {
s = fallback;
}
return s;
}
static parse_string(s, fallback) {
if (!s) {
s = fallback;
}
return s;
}
static parse_json(s, fallback) {
if (!s) {
s = fallback;
}
else {
s = JSON.parse(s);
}
return s;
}
static generate_id(n) {
let result = '';
for (let i = 0; i < n; i++) {
result += Util.CHARS.charAt(Math.floor(Math.random() * Util.CHARS.length));
}
return result;
}
}
export class Time {
static time() {
if (!Date.now) {
Date.now = () => new Date().getTime();
}
return Date.now();
}
static date() {
return new Date();
}
static sleep(i=1000) {
return new Promise(resolve => setTimeout(resolve, i));
}
static async random_sleep(min, max) {
const duration = Math.floor(Math.random() * (max - min) + min);
return await Time.sleep(duration);
}
static seconds_as_hms(t) {
t = Math.max(0, t);
const hours = Util.pad_left(Math.floor(t / 3600), '0', 2);
t %= 3600;
const minutes = Util.pad_left(Math.floor(t / 60), '0', 2);
const seconds = Util.pad_left(Math.floor(t % 60), '0', 2);
return `${hours}:${minutes}:${seconds}`;
}
static string(d=null) {
if (!d) {
d = Time.date();
}
const month = Util.pad_left(d.getMonth() + 1, '0', 2);
const date = Util.pad_left(d.getDate(), '0', 2);
const year = d.getFullYear();
const hours = Util.pad_left(d.getHours() % 12, '0', 2);
const minutes = Util.pad_left(d.getMinutes(), '0', 2);
const seconds = Util.pad_left(d.getSeconds(), '0', 2);
const period = d.getHours() >= 12 ? 'PM' : 'AM';
return `${month}/${date}/${year} ${hours}:${minutes}:${seconds} ${period}`;
}
}
export class SettingsManager {
static DEFAULT = {
version: 15,
key: '',
enabled: true,
disabled_hosts: [],
hcaptcha_auto_open: true,
hcaptcha_auto_solve: true,
hcaptcha_solve_delay: true,
hcaptcha_solve_delay_time: 3000,
recaptcha_auto_open: true,
recaptcha_auto_solve: true,
recaptcha_solve_delay: true,
recaptcha_solve_delay_time: 1000,
recaptcha_solve_method: 'Image',
funcaptcha_auto_open: true,
funcaptcha_auto_solve: true,
funcaptcha_solve_delay: true,
funcaptcha_solve_delay_time: 1000,
awscaptcha_auto_open: true,
awscaptcha_auto_solve: true,
awscaptcha_solve_delay: true,
awscaptcha_solve_delay_time: 1000,
textcaptcha_auto_solve: true,
textcaptcha_solve_delay: true,
textcaptcha_solve_delay_time: 100,
textcaptcha_image_selector: '',
textcaptcha_input_selector: '',
};
static ENCODE_FIELDS = {
enabled: {parse: Util.parse_bool, encode: encodeURIComponent},
disabled_hosts: {parse: Util.parse_json, encode: e => encodeURIComponent(JSON.stringify(e))},
hcaptcha_auto_open: {parse: Util.parse_bool, encode: encodeURIComponent},
hcaptcha_auto_solve: {parse: Util.parse_bool, encode: encodeURIComponent},
hcaptcha_solve_delay: {parse: Util.parse_bool, encode: encodeURIComponent},
hcaptcha_solve_delay_time: {parse: Util.parse_int, encode: encodeURIComponent},
recaptcha_auto_open: {parse: Util.parse_bool, encode: encodeURIComponent},
recaptcha_auto_solve: {parse: Util.parse_bool, encode: encodeURIComponent},
recaptcha_solve_delay: {parse: Util.parse_bool, encode: encodeURIComponent},
recaptcha_solve_delay_time: {parse: Util.parse_int, encode: encodeURIComponent},
recaptcha_solve_method: {parse: Util.parse_string, encode: encodeURIComponent},
funcaptcha_auto_open: {parse: Util.parse_bool, encode: encodeURIComponent},
funcaptcha_auto_solve: {parse: Util.parse_bool, encode: encodeURIComponent},
funcaptcha_solve_delay: {parse: Util.parse_bool, encode: encodeURIComponent},
funcaptcha_solve_delay_time: {parse: Util.parse_int, encode: encodeURIComponent},
awscaptcha_auto_open: {parse: Util.parse_bool, encode: encodeURIComponent},
awscaptcha_auto_solve: {parse: Util.parse_bool, encode: encodeURIComponent},
awscaptcha_solve_delay: {parse: Util.parse_bool, encode: encodeURIComponent},
awscaptcha_solve_delay_time: {parse: Util.parse_int, encode: encodeURIComponent},
textcaptcha_auto_solve: {parse: Util.parse_bool, encode: encodeURIComponent},
textcaptcha_solve_delay: {parse: Util.parse_bool, encode: encodeURIComponent},
textcaptcha_solve_delay_time: {parse: Util.parse_int, encode: encodeURIComponent},
textcaptcha_image_selector: {parse: Util.parse_string, encode: encodeURIComponent},
textcaptcha_input_selector: {parse: Util.parse_string, encode: encodeURIComponent},
};
static IMPORT_URL = 'https://nopecha.com/setup';
static DELIMITER = '|';
static export(settings) {
if (!settings.key) {
return false;
}
const fields = [settings.key];
for (const k in SettingsManager.ENCODE_FIELDS) {
fields.push(`${k}=${SettingsManager.ENCODE_FIELDS[k].encode(settings[k])}`);
}
const encoded_hash = `#${fields.join(SettingsManager.DELIMITER)}`;
return `${SettingsManager.IMPORT_URL}${encoded_hash}`;
}
static import(encoded_hash) {
const settings = {};
// Split by delimiter
const fields = encoded_hash.split(SettingsManager.DELIMITER);
if (fields.length === 0) {
return settings;
}
// Parse key
const key = fields.shift();
if (key.length <= 1) {
console.error('invalid key for settings', key);
return settings;
}
settings.key = key.substring(1);
// Parse additional fields
for (const field of fields) {
const kv = field.split('=');
const k = kv.shift();
const v_raw = kv.join('=');
if (!(k in SettingsManager.ENCODE_FIELDS)) {
console.error('invalid field for settings', field);
continue;
}
const v = decodeURIComponent(v_raw);
console.log('v', v);
settings[k] = SettingsManager.ENCODE_FIELDS[k].parse(v, SettingsManager.DEFAULT[k]);
}
return settings;
}
}

View file

@@ -2,7 +2,7 @@ import { defineConfig } from 'tsup'
export default defineConfig([
{
entry: ['src/index.ts'],
entry: ['src/index.ts', 'src/cli.ts'],
outDir: 'build',
target: 'node16',
platform: 'node',