Compare commits

...

80 Commits

Author SHA1 Message Date
taizan-hokouto
aaf9860bdc Merge branch 'release/v0.4.7' 2020-11-18 01:25:10 +09:00
taizan-hokouto
83ad4dcf1f Increment version 2020-11-18 01:24:37 +09:00
taizan-hokouto
765251b872 Merge branch 'feature/pipenv' into develop 2020-11-18 01:17:35 +09:00
taizan-hokouto
7ea88fead2 Modify requirements 2020-11-18 01:16:49 +09:00
taizan-hokouto
ea67e3e54e Add pipenv files 2020-11-18 01:16:01 +09:00
taizan-hokouto
a5c7ba52c8 Merge branch 'hotfix/test' 2020-11-17 01:11:22 +09:00
taizan-hokouto
7cf780ee87 Merge branch 'master' into develop 2020-11-17 01:11:22 +09:00
taizan-hokouto
c37201fa03 Remove tests 2020-11-17 01:10:54 +09:00
taizan-hokouto
6fcc1393de Merge branch 'master' into develop 2020-11-17 01:01:56 +09:00
taizan-hokouto
a474899268 Merge branch 'hotfix/tests' 2020-11-17 01:00:39 +09:00
taizan-hokouto
3f72eb0e00 Remove tests 2020-11-17 00:59:48 +09:00
taizan-hokouto
661d1e4b81 Fix tests 2020-11-17 00:54:32 +09:00
taizan-hokouto
4652a56bc6 Merge branch 'hotfix/json' 2020-11-16 23:32:32 +09:00
taizan-hokouto
966320cab5 Merge branch 'master' into develop 2020-11-16 23:32:32 +09:00
taizan-hokouto
35218a66da Remove unnecessary import 2020-11-16 23:32:14 +09:00
taizan-hokouto
3432609588 Merge branch 'hotfix/json' 2020-11-16 23:29:50 +09:00
taizan-hokouto
3ad6b7e845 Merge branch 'master' into develop 2020-11-16 23:29:50 +09:00
taizan-hokouto
48669e5f53 Fix tests 2020-11-16 23:29:24 +09:00
taizan-hokouto
7b0708ec46 Merge branch 'master' into develop 2020-11-16 23:17:37 +09:00
taizan-hokouto
f46df3ae42 Merge branch 'hotfix/json' 2020-11-16 23:17:36 +09:00
taizan-hokouto
96c028bd5d Increment version 2020-11-16 23:17:10 +09:00
taizan-hokouto
402dc15d7a Add tests 2020-11-16 23:11:51 +09:00
taizan-hokouto
6088ab6932 Fix jsonifying 2020-11-16 22:50:53 +09:00
taizan-hokouto
13812bdad3 Merge tag 'v0.4.5' into develop (tag: v0.4.5) 2020-11-16 01:50:50 +09:00
taizan-hokouto
d98d34d8b3 Merge branch 'release/v0.4.5' 2020-11-16 01:50:49 +09:00
taizan-hokouto
24fa104e84 Increment version 2020-11-16 01:50:25 +09:00
taizan-hokouto
b4dad8c641 Merge branch 'feature/archiver' into develop 2020-11-16 01:49:34 +09:00
taizan-hokouto
3550cd6d91 Use temporary file to reduce memory usage 2020-11-16 01:37:31 +09:00
taizan-hokouto
2815b48e0e Return filename 2020-11-16 01:36:59 +09:00
taizan-hokouto
650e6ccb65 Remove unnecessary lines 2020-11-16 01:17:10 +09:00
taizan-hokouto
4a00a19a43 Change argument name 2020-11-16 01:16:09 +09:00
taizan-hokouto
b067eda7b6 Separate modules 2020-11-16 01:15:36 +09:00
taizan-hokouto
1b6bc86e76 Fix handling exception 2020-11-15 23:49:36 +09:00
taizan-hokouto
da2b513bcc Reduce delay 2020-11-15 19:52:00 +09:00
taizan-hokouto
6adae578ef Return generator instead of list 2020-11-15 19:50:53 +09:00
taizan-hokuto
128a834841 Merge branch 'hotfix/fix' 2020-11-15 16:54:24 +09:00
taizan-hokuto
086a14115f Merge tag 'fix' into develop 2020-11-15 16:54:24 +09:00
taizan-hokuto
6a392f3e1a Increment version 2020-11-15 16:53:36 +09:00
taizan-hokuto
93127a703c Revert 2020-11-15 16:53:03 +09:00
taizan-hokuto
e4ddbaf8ae Merge branch 'develop' 2020-11-15 16:39:07 +09:00
taizan-hokuto
ec75058605 Merge pull request #22 from wakamezake/github_actions (Add GitHub actions) 2020-11-15 16:05:13 +09:00
taizan-hokouto
2b62e5dc5e Merge branch 'feature/pr_22' into develop 2020-11-15 15:59:52 +09:00
taizan-hokouto
8d7874096e Fix datetime tests 2020-11-15 15:59:28 +09:00
taizan-hokouto
99fcab83c8 Revert 2020-11-15 15:49:39 +09:00
wakamezake
3027bc0579 change timezone utc to jst 2020-11-15 15:39:16 +09:00
wakamezake
b1b70a4e76 delete cache 2020-11-15 15:39:16 +09:00
wakamezake
de41341d84 typo 2020-11-15 15:39:16 +09:00
wakamezake
a03d43b081 version up 2020-11-15 15:39:16 +09:00
wakamezake
f60aaade7f init 2020-11-15 15:39:16 +09:00
wakamezake
d3c34086ff change timezone utc to jst 2020-11-15 11:29:12 +09:00
wakamezake
6b58c9bcf5 delete cache 2020-11-15 10:50:14 +09:00
wakamezake
c2cba1651e Merge remote-tracking branch 'upstream/master' into github_actions 2020-11-15 10:40:00 +09:00
taizan-hokouto
ada3eb437d Merge branch 'hotfix/test_requirements' 2020-11-15 09:22:38 +09:00
taizan-hokouto
c1517d5be8 Merge branch 'master' into develop 2020-11-15 09:22:38 +09:00
taizan-hokouto
351034d1e6 Increment version 2020-11-15 09:21:58 +09:00
taizan-hokouto
c1db5a0c47 Update requirements.txt and requirements_test.txt 2020-11-15 09:18:01 +09:00
wakamezake
088dce712a typo 2020-11-14 18:08:41 +09:00
wakamezake
425e880b09 version up 2020-11-14 18:07:30 +09:00
wakamezake
62ec78abee init 2020-11-14 18:04:49 +09:00
taizan-hokouto
c84a32682c Merge branch 'hotfix/fix_prompt' 2020-11-08 12:31:52 +09:00
taizan-hokouto
74277b2afe Merge branch 'master' into develop 2020-11-08 12:31:52 +09:00
taizan-hokouto
cd20b74b2a Increment version 2020-11-08 12:31:16 +09:00
taizan-hokouto
06f54fd985 Remove unnecessary console output 2020-11-08 12:30:40 +09:00
taizan-hokouto
98b0470703 Merge tag 'emoji' into develop (tag: v0.4.1) 2020-11-06 19:58:45 +09:00
taizan-hokouto
bb4113b53c Merge branch 'hotfix/emoji' 2020-11-06 19:58:44 +09:00
taizan-hokouto
07f4382ed4 Increment version 2020-11-06 19:57:16 +09:00
taizan-hokouto
d40720616b Fix emoji encoding 2020-11-06 19:56:54 +09:00
taizan-hokouto
eebe7c79bd Merge branch 'master' into develop 2020-11-05 22:19:11 +09:00
taizan-hokouto
6c9e327e36 Merge branch 'hotfix/fix_readme' 2020-11-05 22:19:11 +09:00
taizan-hokouto
e9161c0ddd Update README 2020-11-05 22:18:54 +09:00
taizan-hokouto
c8b75dcf0e Merge branch 'master' into develop 2020-11-05 00:14:50 +09:00
taizan-hokouto
30cb7d7043 Merge branch 'hotfix/fix_readme' 2020-11-05 00:14:50 +09:00
taizan-hokouto
19d5b74beb Update README 2020-11-05 00:14:36 +09:00
taizan-hokouto
d5c3e45edc Merge branch 'master' into develop 2020-11-03 20:21:53 +09:00
taizan-hokouto
1d479fc15c Merge branch 'hotfix/fix_readme' 2020-11-03 20:21:52 +09:00
taizan-hokouto
20a20ddd08 Update README 2020-11-03 20:21:39 +09:00
taizan-hokouto
00c239f974 Merge branch 'master' into develop 2020-11-03 20:10:48 +09:00
taizan-hokouto
67b766b32c Merge branch 'hotfix/fix_readme' 2020-11-03 20:10:48 +09:00
taizan-hokouto
249aa0d147 Update README 2020-11-03 20:10:34 +09:00
taizan-hokouto
c708a588d8 Merge tag 'v0.4.0' into develop (tag: v0.4.0) 2020-11-03 18:20:10 +09:00
18 changed files with 687 additions and 195 deletions

.github/workflows/run_test.yml (vendored, new file, +27 lines)

@@ -0,0 +1,27 @@
name: Run All UnitTest
on: [push, pull_request]
jobs:
build:
runs-on: ubuntu-latest
strategy:
max-parallel: 4
matrix:
python-version: [3.7, 3.8]
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt -r requirements_test.txt
- name: Test with pytest
run: |
export PYTHONPATH=./
pytest --verbose --color=yes

Pipfile (new file, +19 lines)

@@ -0,0 +1,19 @@
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
httpx = {extras = ["http2"], version = "*"}
protobuf = "==3.14.0"
pytz = "*"
urllib3 = "*"
[dev-packages]
pytest-mock = "*"
pytest-httpx = "*"
wheel = "*"
twine = "*"
[requires]
python_version = ">=3.6"

Pipfile.lock (generated, new file, +425 lines)

@@ -0,0 +1,425 @@
{
"_meta": {
"hash": {
"sha256": "aa731e6542f5f65756b98efe3e444ecffe78843c1041ab041cafdd2592c607db"
},
"pipfile-spec": 6,
"requires": {
"python_version": ">=3.6"
},
"sources": [
{
"name": "pypi",
"url": "https://pypi.org/simple",
"verify_ssl": true
}
]
},
"default": {
"certifi": {
"hashes": [
"sha256:1f422849db327d534e3d0c5f02a263458c3955ec0aae4ff09b95f195c59f4edd",
"sha256:f05def092c44fbf25834a51509ef6e631dc19765ab8a57b4e7ab85531f0a9cf4"
],
"version": "==2020.11.8"
},
"h11": {
"hashes": [
"sha256:3c6c61d69c6f13d41f1b80ab0322f1872702a3ba26e12aa864c928f6a43fbaab",
"sha256:ab6c335e1b6ef34b205d5ca3e228c9299cc7218b049819ec84a388c2525e5d87"
],
"version": "==0.11.0"
},
"h2": {
"hashes": [
"sha256:61e0f6601fa709f35cdb730863b4e5ec7ad449792add80d1410d4174ed139af5",
"sha256:875f41ebd6f2c44781259005b157faed1a5031df3ae5aa7bcb4628a6c0782f14"
],
"version": "==3.2.0"
},
"hpack": {
"hashes": [
"sha256:0edd79eda27a53ba5be2dfabf3b15780928a0dff6eb0c60a3d6767720e970c89",
"sha256:8eec9c1f4bfae3408a3f30500261f7e6a65912dc138526ea054f9ad98892e9d2"
],
"version": "==3.0.0"
},
"httpcore": {
"hashes": [
"sha256:37660b117ba9055e8d5d19c29684d2204bbd3150020dde0ebd2dd2bcf18dfe50",
"sha256:3c5fcd97c52c3f6a1e4d939d776458e6177b5c238b825ed51d72840e582573b5"
],
"markers": "python_version >= '3.6'",
"version": "==0.12.1"
},
"httpx": {
"extras": [
"http2"
],
"hashes": [
"sha256:126424c279c842738805974687e0518a94c7ae8d140cd65b9c4f77ac46ffa537",
"sha256:9cffb8ba31fac6536f2c8cde30df859013f59e4bcc5b8d43901cb3654a8e0a5b"
],
"index": "pypi",
"version": "==0.16.1"
},
"hyperframe": {
"hashes": [
"sha256:5187962cb16dcc078f23cb5a4b110098d546c3f41ff2d4038a9896893bbd0b40",
"sha256:a9f5c17f2cc3c719b917c4f33ed1c61bd1f8dfac4b1bd23b7c80b3400971b41f"
],
"version": "==5.2.0"
},
"idna": {
"hashes": [
"sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6",
"sha256:b97d804b1e9b523befed77c48dacec60e6dcb0b5391d57af6a65a312a90648c0"
],
"version": "==2.10"
},
"protobuf": {
"hashes": [
"sha256:0e247612fadda953047f53301a7b0407cb0c3cb4ae25a6fde661597a04039b3c",
"sha256:0fc96785262042e4863b3f3b5c429d4636f10d90061e1840fce1baaf59b1a836",
"sha256:1c51fda1bbc9634246e7be6016d860be01747354ed7015ebe38acf4452f470d2",
"sha256:1d63eb389347293d8915fb47bee0951c7b5dab522a4a60118b9a18f33e21f8ce",
"sha256:22bcd2e284b3b1d969c12e84dc9b9a71701ec82d8ce975fdda19712e1cfd4e00",
"sha256:2a7e2fe101a7ace75e9327b9c946d247749e564a267b0515cf41dfe450b69bac",
"sha256:43b554b9e73a07ba84ed6cf25db0ff88b1e06be610b37656e292e3cbb5437472",
"sha256:4b74301b30513b1a7494d3055d95c714b560fbb630d8fb9956b6f27992c9f980",
"sha256:4e75105c9dfe13719b7293f75bd53033108f4ba03d44e71db0ec2a0e8401eafd",
"sha256:5b7a637212cc9b2bcf85dd828b1178d19efdf74dbfe1ddf8cd1b8e01fdaaa7f5",
"sha256:5e9806a43232a1fa0c9cf5da8dc06f6910d53e4390be1fa06f06454d888a9142",
"sha256:629b03fd3caae7f815b0c66b41273f6b1900a579e2ccb41ef4493a4f5fb84f3a",
"sha256:72230ed56f026dd664c21d73c5db73ebba50d924d7ba6b7c0d81a121e390406e",
"sha256:86a75477addde4918e9a1904e5c6af8d7b691f2a3f65587d73b16100fbe4c3b2",
"sha256:8971c421dbd7aad930c9bd2694122f332350b6ccb5202a8b7b06f3f1a5c41ed5",
"sha256:9616f0b65a30851e62f1713336c931fcd32c057202b7ff2cfbfca0fc7d5e3043",
"sha256:b0d5d35faeb07e22a1ddf8dce620860c8fe145426c02d1a0ae2688c6e8ede36d",
"sha256:ecc33531a213eee22ad60e0e2aaea6c8ba0021f0cce35dbf0ab03dee6e2a23a1"
],
"index": "pypi",
"version": "==3.14.0"
},
"pytz": {
"hashes": [
"sha256:3e6b7dd2d1e0a59084bcee14a17af60c5c562cdc16d828e8eba2e683d3a7e268",
"sha256:5c55e189b682d420be27c6995ba6edce0c0a77dd67bfbe2ae6607134d5851ffd"
],
"index": "pypi",
"version": "==2020.4"
},
"rfc3986": {
"extras": [
"idna2008"
],
"hashes": [
"sha256:112398da31a3344dc25dbf477d8df6cb34f9278a94fee2625d89e4514be8bb9d",
"sha256:af9147e9aceda37c91a05f4deb128d4b4b49d6b199775fd2d2927768abdc8f50"
],
"version": "==1.4.0"
},
"six": {
"hashes": [
"sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259",
"sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==1.15.0"
},
"sniffio": {
"hashes": [
"sha256:471b71698eac1c2112a40ce2752bb2f4a4814c22a54a3eed3676bc0f5ca9f663",
"sha256:c4666eecec1d3f50960c6bdf61ab7bc350648da6c126e3cf6898d8cd4ddcd3de"
],
"markers": "python_version >= '3.5'",
"version": "==1.2.0"
},
"urllib3": {
"hashes": [
"sha256:19188f96923873c92ccb987120ec4acaa12f0461fa9ce5d3d0772bc965a39e08",
"sha256:d8ff90d979214d7b4f8ce956e80f4028fc6860e4431f731ea4a8c08f23f99473"
],
"index": "pypi",
"version": "==1.26.2"
}
},
"develop": {
"atomicwrites": {
"hashes": [
"sha256:6d1784dea7c0c8d4a5172b6c620f40b6e4cbfdf96d783691f2e1302a7b88e197",
"sha256:ae70396ad1a434f9c7046fd2dd196fc04b12f9e91ffb859164193be8b6168a7a"
],
"markers": "sys_platform == 'win32'",
"version": "==1.4.0"
},
"attrs": {
"hashes": [
"sha256:31b2eced602aa8423c2aea9c76a724617ed67cf9513173fd3a4f03e3a929c7e6",
"sha256:832aa3cde19744e49938b91fea06d69ecb9e649c93ba974535d08ad92164f700"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==20.3.0"
},
"bleach": {
"hashes": [
"sha256:52b5919b81842b1854196eaae5ca29679a2f2e378905c346d3ca8227c2c66080",
"sha256:9f8ccbeb6183c6e6cddea37592dfb0167485c1e3b13b3363bc325aa8bda3adbd"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'",
"version": "==3.2.1"
},
"certifi": {
"hashes": [
"sha256:1f422849db327d534e3d0c5f02a263458c3955ec0aae4ff09b95f195c59f4edd",
"sha256:f05def092c44fbf25834a51509ef6e631dc19765ab8a57b4e7ab85531f0a9cf4"
],
"version": "==2020.11.8"
},
"chardet": {
"hashes": [
"sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae",
"sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"
],
"version": "==3.0.4"
},
"colorama": {
"hashes": [
"sha256:5941b2b48a20143d2267e95b1c2a7603ce057ee39fd88e7329b0c292aa16869b",
"sha256:9f47eda37229f68eee03b24b9748937c7dc3868f906e8ba69fbcbdd3bc5dc3e2"
],
"markers": "sys_platform == 'win32'",
"version": "==0.4.4"
},
"docutils": {
"hashes": [
"sha256:0c5b78adfbf7762415433f5515cd5c9e762339e23369dbe8000d84a4bf4ab3af",
"sha256:c2de3a60e9e7d07be26b7f2b00ca0309c207e06c100f9cc2a94931fc75a478fc"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'",
"version": "==0.16"
},
"h11": {
"hashes": [
"sha256:3c6c61d69c6f13d41f1b80ab0322f1872702a3ba26e12aa864c928f6a43fbaab",
"sha256:ab6c335e1b6ef34b205d5ca3e228c9299cc7218b049819ec84a388c2525e5d87"
],
"version": "==0.11.0"
},
"httpcore": {
"hashes": [
"sha256:37660b117ba9055e8d5d19c29684d2204bbd3150020dde0ebd2dd2bcf18dfe50",
"sha256:3c5fcd97c52c3f6a1e4d939d776458e6177b5c238b825ed51d72840e582573b5"
],
"markers": "python_version >= '3.6'",
"version": "==0.12.1"
},
"httpx": {
"extras": [
"http2"
],
"hashes": [
"sha256:126424c279c842738805974687e0518a94c7ae8d140cd65b9c4f77ac46ffa537",
"sha256:9cffb8ba31fac6536f2c8cde30df859013f59e4bcc5b8d43901cb3654a8e0a5b"
],
"index": "pypi",
"version": "==0.16.1"
},
"idna": {
"hashes": [
"sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6",
"sha256:b97d804b1e9b523befed77c48dacec60e6dcb0b5391d57af6a65a312a90648c0"
],
"version": "==2.10"
},
"iniconfig": {
"hashes": [
"sha256:011e24c64b7f47f6ebd835bb12a743f2fbe9a26d4cecaa7f53bc4f35ee9da8b3",
"sha256:bc3af051d7d14b2ee5ef9969666def0cd1a000e121eaea580d4a313df4b37f32"
],
"version": "==1.1.1"
},
"keyring": {
"hashes": [
"sha256:12de23258a95f3b13e5b167f7a641a878e91eab8ef16fafc077720a95e6115bb",
"sha256:207bd66f2a9881c835dad653da04e196c678bf104f8252141d2d3c4f31051579"
],
"markers": "python_version >= '3.6'",
"version": "==21.5.0"
},
"packaging": {
"hashes": [
"sha256:4357f74f47b9c12db93624a82154e9b120fa8293699949152b22065d556079f8",
"sha256:998416ba6962ae7fbd6596850b80e17859a5753ba17c32284f67bfff33784181"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==20.4"
},
"pkginfo": {
"hashes": [
"sha256:a6a4ac943b496745cec21f14f021bbd869d5e9b4f6ec06918cffea5a2f4b9193",
"sha256:ce14d7296c673dc4c61c759a0b6c14bae34e34eb819c0017bb6ca5b7292c56e9"
],
"version": "==1.6.1"
},
"pluggy": {
"hashes": [
"sha256:15b2acde666561e1298d71b523007ed7364de07029219b604cf808bfa1c765b0",
"sha256:966c145cd83c96502c3c3868f50408687b38434af77734af1e9ca461a4081d2d"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==0.13.1"
},
"py": {
"hashes": [
"sha256:366389d1db726cd2fcfc79732e75410e5fe4d31db13692115529d34069a043c2",
"sha256:9ca6883ce56b4e8da7e79ac18787889fa5206c79dcc67fb065376cd2fe03f342"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==1.9.0"
},
"pygments": {
"hashes": [
"sha256:381985fcc551eb9d37c52088a32914e00517e57f4a21609f48141ba08e193fa0",
"sha256:88a0bbcd659fcb9573703957c6b9cff9fab7295e6e76db54c9d00ae42df32773"
],
"markers": "python_version >= '3.5'",
"version": "==2.7.2"
},
"pyparsing": {
"hashes": [
"sha256:c203ec8783bf771a155b207279b9bccb8dea02d8f0c9e5f8ead507bc3246ecc1",
"sha256:ef9d7589ef3c200abe66653d3f1ab1033c3c419ae9b9bdb1240a85b024efc88b"
],
"markers": "python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==2.4.7"
},
"pytest": {
"hashes": [
"sha256:4288fed0d9153d9646bfcdf0c0428197dba1ecb27a33bb6e031d002fa88653fe",
"sha256:c0a7e94a8cdbc5422a51ccdad8e6f1024795939cc89159a0ae7f0b316ad3823e"
],
"markers": "python_version >= '3.5'",
"version": "==6.1.2"
},
"pytest-httpx": {
"hashes": [
"sha256:1cee873fdad622ca21169105691607db1411c9927aae9c2f44c02a893977c8f3",
"sha256:b996c8a4be900dfd37746d438cc9fc9321d37ffcacc1f5b7a9fc391daa208456"
],
"index": "pypi",
"version": "==0.10.0"
},
"pytest-mock": {
"hashes": [
"sha256:024e405ad382646318c4281948aadf6fe1135632bea9cc67366ea0c4098ef5f2",
"sha256:a4d6d37329e4a893e77d9ffa89e838dd2b45d5dc099984cf03c703ac8411bb82"
],
"index": "pypi",
"version": "==3.3.1"
},
"pywin32-ctypes": {
"hashes": [
"sha256:24ffc3b341d457d48e8922352130cf2644024a4ff09762a2261fd34c36ee5942",
"sha256:9dc2d991b3479cc2df15930958b674a48a227d5361d413827a4cfd0b5876fc98"
],
"markers": "sys_platform == 'win32'",
"version": "==0.2.0"
},
"readme-renderer": {
"hashes": [
"sha256:267854ac3b1530633c2394ead828afcd060fc273217c42ac36b6be9c42cd9a9d",
"sha256:6b7e5aa59210a40de72eb79931491eaf46fefca2952b9181268bd7c7c65c260a"
],
"version": "==28.0"
},
"requests": {
"hashes": [
"sha256:7f1a0b932f4a60a1a65caa4263921bb7d9ee911957e0ae4a23a6dd08185ad5f8",
"sha256:e786fa28d8c9154e6a4de5d46a1d921b8749f8b74e28bde23768e5e16eece998"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'",
"version": "==2.25.0"
},
"requests-toolbelt": {
"hashes": [
"sha256:380606e1d10dc85c3bd47bf5a6095f815ec007be7a8b69c878507068df059e6f",
"sha256:968089d4584ad4ad7c171454f0a5c6dac23971e9472521ea3b6d49d610aa6fc0"
],
"version": "==0.9.1"
},
"rfc3986": {
"extras": [
"idna2008"
],
"hashes": [
"sha256:112398da31a3344dc25dbf477d8df6cb34f9278a94fee2625d89e4514be8bb9d",
"sha256:af9147e9aceda37c91a05f4deb128d4b4b49d6b199775fd2d2927768abdc8f50"
],
"version": "==1.4.0"
},
"six": {
"hashes": [
"sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259",
"sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==1.15.0"
},
"sniffio": {
"hashes": [
"sha256:471b71698eac1c2112a40ce2752bb2f4a4814c22a54a3eed3676bc0f5ca9f663",
"sha256:c4666eecec1d3f50960c6bdf61ab7bc350648da6c126e3cf6898d8cd4ddcd3de"
],
"markers": "python_version >= '3.5'",
"version": "==1.2.0"
},
"toml": {
"hashes": [
"sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b",
"sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f"
],
"markers": "python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==0.10.2"
},
"tqdm": {
"hashes": [
"sha256:18d6a615aedd09ec8456d9524489dab330af4bd5c2a14a76eb3f9a0e14471afe",
"sha256:80d9d5165d678dbd027dd102dfb99f71bf05f333b61fb761dbba13b4ab719ead"
],
"markers": "python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==4.52.0"
},
"twine": {
"hashes": [
"sha256:34352fd52ec3b9d29837e6072d5a2a7c6fe4290e97bba46bb8d478b5c598f7ab",
"sha256:ba9ff477b8d6de0c89dd450e70b2185da190514e91c42cc62f96850025c10472"
],
"index": "pypi",
"version": "==3.2.0"
},
"urllib3": {
"hashes": [
"sha256:19188f96923873c92ccb987120ec4acaa12f0461fa9ce5d3d0772bc965a39e08",
"sha256:d8ff90d979214d7b4f8ce956e80f4028fc6860e4431f731ea4a8c08f23f99473"
],
"index": "pypi",
"version": "==1.26.2"
},
"webencodings": {
"hashes": [
"sha256:a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78",
"sha256:b36a1c245f2d304965eb4e0a82848379241dc04b865afcc4aab16748587e1923"
],
"version": "==0.5.1"
},
"wheel": {
"hashes": [
"sha256:497add53525d16c173c2c1c733b8f655510e909ea78cc0e29d374243544b77a2",
"sha256:99a22d87add3f634ff917310a3d87e499f19e663413a52eb9232c447aa646c9f"
],
"index": "pypi",
"version": "==0.35.1"
}
}
}


@@ -24,12 +24,14 @@ pip install pytchat
### CLI
One-liner command.
Save chat data to html with embedded custom emojis.
Show chat stream (--echo option).
+ One-liner command.
+ Save chat data to html with embedded custom emojis.
+ Show chat stream (--echo option).
```bash
$ pytchat -v https://www.youtube.com/watch?v=uIx8l2xlYVY -o "c:/temp/"
$ pytchat -v uIx8l2xlYVY -o "c:/temp/"
# options:
# -v : Video ID or URL that includes ID
# -o : output directory (default path: './')
@@ -38,7 +40,7 @@ $ pytchat -v https://www.youtube.com/watch?v=uIx8l2xlYVY -o "c:/temp/"
```
### On-demand mode with simple non-buffered object.
### Fetch chat data (see [wiki](https://github.com/taizan-hokuto/pytchat/wiki/PytchatCore))
```python
import pytchat
chat = pytchat.create(video_id="uIx8l2xlYVY")
@@ -47,7 +49,8 @@ while chat.is_alive():
print(f"{c.datetime} [{c.author.name}]- {c.message}")
```
### Output JSON format (feature of [DefaultProcessor](DefaultProcessor))
### Output JSON format string (feature of [DefaultProcessor](https://github.com/taizan-hokuto/pytchat/wiki/DefaultProcessor))
```python
import pytchat
import time
@@ -58,35 +61,21 @@ while chat.is_alive():
time.sleep(5)
'''
# Each chat item can also be output in JSON format.
for c in chat.get().sync_items():
for c in chat.get().items:
print(c.json())
'''
```
### other
#### Fetch chat with buffer.
[LiveChat](https://github.com/taizan-hokuto/pytchat/wiki/LiveChat)
+ Fetch chat with a buffer ([LiveChat](https://github.com/taizan-hokuto/pytchat/wiki/LiveChat))
#### Asyncio Context
[LiveChatAsync](https://github.com/taizan-hokuto/pytchat/wiki/LiveChatAsync)
+ Use with asyncio ([LiveChatAsync](https://github.com/taizan-hokuto/pytchat/wiki/LiveChatAsync))
#### [YT API compatible chat processor]https://github.com/taizan-hokuto/pytchat/wiki/CompatibleProcessor)
+ YT API compatible chat processor ([CompatibleProcessor](https://github.com/taizan-hokuto/pytchat/wiki/CompatibleProcessor))
### [Extract archived chat data](https://github.com/taizan-hokuto/pytchat/wiki/Extractor)
```python
from pytchat import HTMLArchiver, Extractor
+ Extract archived chat data ([Extractor](https://github.com/taizan-hokuto/pytchat/wiki/Extractor))
video_id = "*******"
ex = Extractor(
video_id,
div=10,
processor=HTMLArchiver("c:/test.html")
)
ex.extract()
print("finished.")
```
## Structure of Default Processor
Each item can be got with `sync_items()` function.
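The README example above prints each chat item's fields and its JSON form. A minimal, stdlib-only sketch of that per-item formatting, using a hypothetical item dict in place of a real DefaultProcessor chat object (the field names mirror what the README prints; the real object has more attributes):

```python
import json

# Hypothetical chat item mirroring the fields the README example prints
# (c.datetime, c.author.name, c.message); not the real pytchat object.
item = {
    "datetime": "2020-11-16 01:50:50",
    "author": {"name": "taizan-hokuto"},
    "message": "Merge branch 'release/v0.4.5'",
}

# Same line format as the README's print statement.
line = f"{item['datetime']} [{item['author']['name']}]- {item['message']}"
print(line)

# Analogue of c.json(): serialize the whole item to a JSON string.
print(json.dumps(item))
```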


@@ -1,8 +1,8 @@
"""
pytchat is a lightweight python library to browse youtube livechat without Selenium or BeautifulSoup.
"""
__copyright__ = 'Copyright (C) 2019 taizan-hokuto'
__version__ = '0.4.0'
__copyright__ = 'Copyright (C) 2019, 2020 taizan-hokuto'
__version__ = '0.4.7'
__license__ = 'MIT'
__author__ = 'taizan-hokuto'
__author_email__ = '55448286+taizan-hokuto@users.noreply.github.com'


@@ -1,31 +1,21 @@
import argparse
import asyncio
try:
from asyncio import CancelledError
except ImportError:
from asyncio.futures import CancelledError
import os
import signal
from json.decoder import JSONDecodeError
from pathlib import Path
from httpcore import ReadTimeout as HCReadTimeout, NetworkError as HCNetworkError
from .arguments import Arguments
from .echo import Echo
from .progressbar import ProgressBar
from .. exceptions import InvalidVideoIdException, NoContents, PatternUnmatchError, UnknownConnectionError
from .. processors.html_archiver import HTMLArchiver
from .. tool.extract.extractor import Extractor
from .. tool.videoinfo import VideoInfo
from .. util.extract_video_id import extract_video_id
from .. import util
from .. exceptions import InvalidVideoIdException
from .. import __version__
from .cli_extractor import CLIExtractor
'''
Most of CLI modules refer to
Petter Kraabøl's Twitch-Chat-Downloader
https://github.com/PetterKraabol/Twitch-Chat-Downloader
(MIT License)
'''
@@ -38,20 +28,19 @@ def main():
'If ID starts with a hyphen (-), enclose the ID in square brackets.')
parser.add_argument('-o', f'--{Arguments.Name.OUTPUT}', type=str,
help='Output directory (end with "/"). default="./"', default='./')
parser.add_argument(f'--{Arguments.Name.SAVE_ERROR_DATA}', action='store_true',
help='Save error data when error occurs(".dat" file)')
parser.add_argument(f'--{Arguments.Name.DEBUG}', action='store_true',
help='Debug mode. Stop when exceptions have occurred and save error data (".dat" file).')
parser.add_argument(f'--{Arguments.Name.VERSION}', action='store_true',
help='Show version')
help='Show version.')
parser.add_argument(f'--{Arguments.Name.ECHO}', action='store_true',
help='Show chats of specified video')
help='Display chats of specified video.')
Arguments(parser.parse_args().__dict__)
if Arguments().print_version:
print(f'pytchat v{__version__} © 2019,2020 taizan-hokuto')
print(f'pytchat v{__version__} © 2019, 2020 taizan-hokuto')
return
# Extractor
if not Arguments().video_ids:
parser.print_help()
return
@@ -59,7 +48,7 @@ def main():
# Echo
if Arguments().echo:
if len(Arguments().video_ids) > 1:
print("You can specify only one video ID.")
print("When using --echo option, only one video ID can be specified.")
return
try:
Echo(Arguments().video_ids[0]).run()
@@ -67,111 +56,16 @@ def main():
print("Invalid video id:", str(e))
except Exception as e:
print(type(e), str(e))
if Arguments().debug:
raise
finally:
return
# Extractor
if not os.path.exists(Arguments().output):
print("\nThe specified directory does not exist.:{}\n".format(Arguments().output))
return
try:
Runner().run()
CLIExtractor().run()
except CancelledError as e:
print(str(e))
class Runner:
def run(self) -> None:
ex = None
pbar = None
for counter, video_id in enumerate(Arguments().video_ids):
if len(Arguments().video_ids) > 1:
print(f"\n{'-' * 10} video:{counter + 1} of {len(Arguments().video_ids)} {'-' * 10}")
try:
video_id = extract_video_id(video_id)
separated_path = str(Path(Arguments().output)) + os.path.sep
path = util.checkpath(separated_path + video_id + '.html')
try:
info = VideoInfo(video_id)
except (PatternUnmatchError, JSONDecodeError) as e:
print("Cannot parse video information.:{} {}".format(video_id, type(e)))
if Arguments().save_error_data:
util.save(str(e.doc), "ERR", ".dat")
continue
except Exception as e:
print("Cannot parse video information.:{} {}".format(video_id, type(e)))
continue
print(f"\n"
f" video_id: {video_id}\n"
f" channel: {info.get_channel_name()}\n"
f" title: {info.get_title()}\n"
f" output path: {path}")
duration = info.get_duration()
pbar = ProgressBar(total=(duration * 1000), status_txt="Extracting")
ex = Extractor(video_id,
callback=pbar.disp,
div=10)
signal.signal(signal.SIGINT, (lambda a, b: self.cancel(ex, pbar)))
data = ex.extract()
if data == []:
continue
pbar.reset("#", "=", total=len(data), status_txt="Rendering ")
processor = HTMLArchiver(path, callback=pbar.disp)
processor.process(
[{'video_id': None,
'timeout': 1,
'chatdata': (action["replayChatItemAction"]["actions"][0] for action in data)}]
)
processor.finalize()
pbar.reset('#', '#', status_txt='Completed ')
pbar.close()
print()
if pbar.is_cancelled():
print("\nThe extraction process has been discontinued.\n")
except InvalidVideoIdException:
print("Invalid Video ID or URL:", video_id)
except NoContents as e:
print(f"Abort:{str(e)}:[{video_id}]")
except (JSONDecodeError, PatternUnmatchError) as e:
print("{}:{}".format(e.msg, video_id))
if Arguments().save_error_data:
util.save(e.doc, "ERR_", ".dat")
except (UnknownConnectionError, HCNetworkError, HCReadTimeout) as e:
print(f"An unknown network error occurred during the processing of [{video_id}]. : " + str(e))
except Exception as e:
print(f"Abort:{str(type(e))} {str(e)[:80]}")
finally:
clear_tasks()
return
def cancel(self, ex=None, pbar=None) -> None:
'''Called when keyboard interrupted has occurred.
'''
print("\nKeyboard interrupted.\n")
if ex and pbar:
ex.cancel()
pbar.cancel()
def clear_tasks():
'''
Clear remained tasks.
Called when internal exception has occurred or
after each extraction process is completed.
'''
async def _shutdown():
tasks = [t for t in asyncio.all_tasks()
if t is not asyncio.current_task()]
for task in tasks:
task.cancel()
try:
loop = asyncio.get_event_loop()
loop.run_until_complete(_shutdown())
except Exception as e:
print(e)


@@ -18,7 +18,7 @@ class Arguments(metaclass=Singleton):
VERSION: str = 'version'
OUTPUT: str = 'output_dir'
VIDEO_IDS: str = 'video_id'
SAVE_ERROR_DATA: bool = 'save_error_data'
DEBUG: bool = 'debug'
ECHO: bool = 'echo'
def __init__(self,
@@ -36,10 +36,10 @@ class Arguments(metaclass=Singleton):
self.print_version: bool = arguments[Arguments.Name.VERSION]
self.output: str = arguments[Arguments.Name.OUTPUT]
self.video_ids: List[int] = []
self.save_error_data: bool = arguments[Arguments.Name.SAVE_ERROR_DATA]
self.debug: bool = arguments[Arguments.Name.DEBUG]
self.echo: bool = arguments[Arguments.Name.ECHO]
# Videos
if arguments[Arguments.Name.VIDEO_IDS]:
self.video_ids = [video_id
for video_id in arguments[Arguments.Name.VIDEO_IDS].split(',')]
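The hunk above swaps the `--save_error_data` flag for `--debug` and stores the parsed value on the singleton. A standalone argparse sketch of the same boolean-flag pattern (simplified; the real code wraps the parsed dict in the `Arguments` singleton):

```python
import argparse

# Minimal sketch of the flag handling shown in the diff above;
# the real CLI defines these via Arguments.Name constants.
parser = argparse.ArgumentParser()
parser.add_argument('--debug', action='store_true',
                    help='Debug mode. Save error data (".dat" file) when an exception occurs.')
parser.add_argument('--echo', action='store_true',
                    help='Display chats of the specified video.')

args = parser.parse_args(['--debug'])
print(args.debug)  # True
print(args.echo)   # False
```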


@@ -0,0 +1,121 @@
import asyncio
import os
import signal
import traceback
from httpcore import ReadTimeout as HCReadTimeout, NetworkError as HCNetworkError
from json.decoder import JSONDecodeError
from pathlib import Path
from .arguments import Arguments
from .progressbar import ProgressBar
from .. import util
from .. exceptions import InvalidVideoIdException, NoContents, PatternUnmatchError, UnknownConnectionError
from .. processors.html_archiver import HTMLArchiver
from .. tool.extract.extractor import Extractor
from .. tool.videoinfo import VideoInfo
from .. util.extract_video_id import extract_video_id
class CLIExtractor:
def run(self) -> None:
ex = None
pbar = None
for counter, video_id in enumerate(Arguments().video_ids):
if len(Arguments().video_ids) > 1:
print(f"\n{'-' * 10} video:{counter + 1} of {len(Arguments().video_ids)} {'-' * 10}")
try:
video_id = extract_video_id(video_id)
separated_path = str(Path(Arguments().output)) + os.path.sep
path = util.checkpath(separated_path + video_id + '.html')
try:
info = VideoInfo(video_id)
except (PatternUnmatchError, JSONDecodeError) as e:
print("Cannot parse video information.:{} {}".format(video_id, type(e)))
if Arguments().debug:
util.save(str(e.doc), "ERR", ".dat")
continue
except Exception as e:
print("Cannot parse video information.:{} {}".format(video_id, type(e)))
continue
print(f"\n"
f" video_id: {video_id}\n"
f" channel: {info.get_channel_name()}\n"
f" title: {info.get_title()}\n"
f" output path: {path}")
duration = info.get_duration()
pbar = ProgressBar(total=(duration * 1000), status_txt="Extracting")
ex = Extractor(video_id,
callback=pbar.disp,
div=10)
signal.signal(signal.SIGINT, (lambda a, b: self.cancel(ex, pbar)))
data = ex.extract()
if data == [] or data is None:
continue
pbar.reset("#", "=", total=1000, status_txt="Rendering ")
processor = HTMLArchiver(path, callback=pbar.disp)
processor.process(
[{'video_id': None,
'timeout': 1,
'chatdata': (action["replayChatItemAction"]["actions"][0] for action in data)}]
)
processor.finalize()
pbar.reset('#', '#', status_txt='Completed ')
pbar.close()
print()
if pbar.is_cancelled():
print("\nThe extraction process has been discontinued.\n")
except InvalidVideoIdException:
print("Invalid Video ID or URL:", video_id)
except NoContents as e:
print(f"Abort:{str(e)}:[{video_id}]")
except (JSONDecodeError, PatternUnmatchError) as e:
print("{}:{}".format(e.msg, video_id))
if Arguments().debug:
filename = util.save(e.doc, "ERR_", ".dat")
traceback.print_exc()
print(f"Saved error data: {filename}")
except (UnknownConnectionError, HCNetworkError, HCReadTimeout) as e:
if Arguments().debug:
traceback.print_exc()
print(f"An unknown network error occurred during the processing of [{video_id}]. : " + str(e))
except Exception as e:
print(f"Abort:{str(type(e))} {str(e)[:80]}")
if Arguments().debug:
traceback.print_exc()
finally:
clear_tasks()
return
def cancel(self, ex=None, pbar=None) -> None:
'''Called when keyboard interrupted has occurred.
'''
print("\nKeyboard interrupted.\n")
if ex and pbar:
ex.cancel()
pbar.cancel()
def clear_tasks():
'''
Clear remained tasks.
Called when internal exception has occurred or
after each extraction process is completed.
'''
async def _shutdown():
tasks = [t for t in asyncio.all_tasks()
if t is not asyncio.current_task()]
for task in tasks:
task.cancel()
try:
loop = asyncio.get_event_loop()
loop.run_until_complete(_shutdown())
except Exception as e:
print(str(e))
if Arguments().debug:
traceback.print_exc()
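The shutdown logic in `clear_tasks` can be exercised on its own. A minimal, self-contained sketch of the same cancel-pending-tasks pattern (the `_worker` coroutine and `main` wrapper here are illustrative, not from the repo):

```python
import asyncio

async def _worker():
    # stands in for a stalled download task
    await asyncio.sleep(3600)

async def _shutdown():
    # cancel every task except the one running this coroutine
    tasks = [t for t in asyncio.all_tasks()
             if t is not asyncio.current_task()]
    for task in tasks:
        task.cancel()
    # wait until the cancellations have propagated
    await asyncio.gather(*tasks, return_exceptions=True)

async def main():
    asyncio.ensure_future(_worker())
    await asyncio.sleep(0)  # let the worker start
    await _shutdown()
    # count tasks still pending besides this one
    return len([t for t in asyncio.all_tasks()
                if t is not asyncio.current_task()])
```

`asyncio.run(main())` returns `0` once the worker has been cancelled; `gather(..., return_exceptions=True)` swallows the resulting `CancelledError`s instead of re-raising them.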


@@ -118,13 +118,10 @@ class PytchatCore:
         except exceptions.ChatParseException as e:
             self._logger.debug(f"[{self._video_id}]{str(e)}")
             self._raise_exception(e)
-        except (TypeError, json.JSONDecodeError) as e:
+        except Exception as e:
             self._logger.error(f"{traceback.format_exc(limit=-1)}")
             self._raise_exception(e)
         self._logger.debug(f"[{self._video_id}]finished fetching chat.")
         self._raise_exception(exceptions.ChatDataFinished)

     def _get_contents(self, continuation, client, headers):
         '''Get 'continuationContents' from livechat json.
         If contents is None at first fetching,
@@ -201,7 +198,7 @@ class PytchatCore:
         raise self._exception_holder

     def _raise_exception(self, exception: Exception = None):
         self._is_alive = False
         self.terminate()
         if self._hold_exception is False:
             raise exception
         self._exception_holder = exception
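The `_raise_exception` hunk above shows a hold-exception pattern: the fetch loop is always terminated, but when holding is enabled the error is stashed in `self._exception_holder` for the caller to re-raise later (the first line of the hunk shows it being re-raised). A minimal sketch of the pattern; the `Core` class and `raise_for_status` name are illustrative, not the repo's actual API:

```python
class Core:
    """Sketch of the hold-exception pattern (illustrative, not pytchat's API)."""

    def __init__(self, hold_exception: bool = True):
        self._hold_exception = hold_exception
        self._exception_holder = None
        self._is_alive = True

    def terminate(self):
        self._is_alive = False

    def _raise_exception(self, exception: Exception = None):
        # the loop is stopped in either mode
        self.terminate()
        if self._hold_exception is False:
            raise exception                  # fail fast
        self._exception_holder = exception   # defer to the caller

    def raise_for_status(self):
        # the caller checks for a deferred error explicitly
        if self._exception_holder is not None:
            raise self._exception_holder
```

With `hold_exception=True`, `_raise_exception(ValueError(...))` returns quietly and the error only surfaces when the caller invokes `raise_for_status()`.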


@@ -186,12 +186,12 @@ class LiveChatAsync:
         except exceptions.ChatParseException as e:
             self._logger.debug(f"[{self._video_id}]{str(e)}")
             raise
-        except (TypeError, json.JSONDecodeError):
+        except Exception:
             self._logger.error(f"{traceback.format_exc(limit = -1)}")
             raise
         self._logger.debug(f"[{self._video_id}] finished fetching chat.")
         raise exceptions.ChatDataFinished

     async def _check_pause(self, continuation):
         if self._pauser.empty():


@@ -179,12 +179,12 @@ class LiveChat:
         except exceptions.ChatParseException as e:
             self._logger.debug(f"[{self._video_id}]{str(e)}")
             raise
-        except (TypeError, json.JSONDecodeError):
+        except Exception:
             self._logger.error(f"{traceback.format_exc(limit=-1)}")
             raise
         self._logger.debug(f"[{self._video_id}] finished fetching chat.")
         raise exceptions.ChatDataFinished

     def _check_pause(self, continuation):
         if self._pauser.empty():


@@ -112,7 +112,7 @@ class Chatdata:
             await asyncio.sleep(1 - stop_interval)

     def json(self) -> str:
-        return json.dumps([vars(a) for a in self.items], ensure_ascii=False, cls=CustomEncoder)
+        return ''.join(("[", ','.join((a.json() for a in self.items)), "]"))


 class DefaultProcessor(ChatProcessor):
@@ -137,7 +137,7 @@ class DefaultProcessor(ChatProcessor):
             if component is None:
                 continue
             timeout += component.get('timeout', 0)
-            chatdata = component.get('chatdata')
+            chatdata = component.get('chatdata')  # if from Extractor, chatdata is generator.
             if chatdata is None:
                 continue
             for action in chatdata:
@@ -153,7 +153,7 @@ class DefaultProcessor(ChatProcessor):
                 chatlist.append(chat)

         if self.first and chatlist:
-            self.abs_diff = time.time() - chatlist[0].timestamp / 1000 + 2
+            self.abs_diff = time.time() - chatlist[0].timestamp / 1000
             self.first = False

         chatdata = Chatdata(chatlist, float(timeout), self.abs_diff)
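The `Chatdata.json()` change above replaces one `json.dumps` over `vars(a)` for every item with concatenation of each item's own `json()` string, so each chat item stays in charge of its own serialization. A sketch of that join pattern with an illustrative `Chat` stand-in (not the repo's real item class):

```python
import json

class Chat:
    """Illustrative chat item that serializes itself, as the real items do."""

    def __init__(self, id, message):
        self.id = id
        self.message = message

    def json(self) -> str:
        return json.dumps(vars(self), ensure_ascii=False)

def to_json_array(items) -> str:
    # same join pattern as the new Chatdata.json()
    return ''.join(("[", ','.join(a.json() for a in items), "]"))
```

`to_json_array([Chat("a", "hi")])` yields a valid JSON array, and an empty item list collapses to `"[]"`.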


@@ -7,7 +7,7 @@ from concurrent.futures import ThreadPoolExecutor
 from .chat_processor import ChatProcessor
 from .default.processor import DefaultProcessor
 from ..exceptions import UnknownConnectionError
+import tempfile

 PATTERN = re.compile(r"(.*)\(([0-9]+)\)$")
@@ -51,11 +51,12 @@ class HTMLArchiver(ChatProcessor):
         self.client = httpx.Client(http2=True)
         self.save_path = self._checkpath(save_path)
         self.processor = DefaultProcessor()
-        self.emoji_table = {}  # tuble for custom emojis. key: emoji_id, value: base64 encoded image binary.
-        self.header = [HEADER_HTML]
-        self.body = ['<body>\n', '<table class="css">\n', self._parse_table_header(fmt_headers)]
+        self.emoji_table = {}  # dict for custom emojis. key: emoji_id, value: base64 encoded image binary.
         self.callback = callback
         self.executor = ThreadPoolExecutor(max_workers=10)
+        self.tmp_fp = tempfile.NamedTemporaryFile(mode="a", encoding="utf-8", delete=False)
+        self.tmp_filename = self.tmp_fp.name
+        self.counter = 0

     def _checkpath(self, filepath):
         splitter = os.path.splitext(os.path.basename(filepath))
@@ -85,9 +86,9 @@ class HTMLArchiver(ChatProcessor):
             Count of total lines written to the file.
         """
         if chat_components is None or len(chat_components) == 0:
-            return
+            return self.save_path, self.counter
         for c in self.processor.process(chat_components).items:
-            self.body.extend(
+            self.tmp_fp.write(
                 self._parse_html_line((
                     c.datetime,
                     c.elapsedTime,
@@ -100,6 +101,8 @@ class HTMLArchiver(ChatProcessor):
             )
             if self.callback:
                 self.callback(None, 1)
+            self.counter += 1
+        return self.save_path, self.counter

     def _parse_html_line(self, raw_line):
         return ''.join(('<tr>',
@@ -123,7 +126,6 @@ class HTMLArchiver(ChatProcessor):
                 resp = self.client.get(url, timeout=30)
                 break
             except httpx.HTTPError as e:
-                print("Network Error. retrying...")
                 err = e
                 time.sleep(3)
         else:
@@ -132,7 +134,7 @@ class HTMLArchiver(ChatProcessor):
         return standard_b64encode(resp.content).decode()

     def _set_emoji_table(self, item: dict):
-        emoji_id = item['id']
+        emoji_id = ''.join(('Z', item['id'])) if 48 <= ord(item['id'][0]) <= 57 else item['id']
         if emoji_id not in self.emoji_table:
             self.emoji_table.setdefault(emoji_id, self.executor.submit(self._encode_img, item['url']))
         return emoji_id
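The `_set_emoji_table` change prefixes a `Z` when the emoji id begins with an ASCII digit (code points 48–57 are `'0'`–`'9'`) — presumably because the id ends up in the generated HTML/CSS, where identifiers that start with a digit are problematic. The check in isolation (helper name is illustrative):

```python
def normalize_emoji_id(emoji_id: str) -> str:
    # '0'..'9' are code points 48..57; prefix such ids with 'Z', as in the diff
    if 48 <= ord(emoji_id[0]) <= 57:
        return ''.join(('Z', emoji_id))
    return emoji_id
```

Ids that already start with a letter pass through unchanged, so existing table keys stay stable.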
@@ -150,9 +152,19 @@ class HTMLArchiver(ChatProcessor):
             '</style>\n'))

     def finalize(self):
         self.executor.shutdown()
-        self.header.extend([self._create_styles(), '</head>\n'])
-        self.body.extend(['</table>\n</body>\n</html>'])
-        with open(self.save_path, mode='a', encoding='utf-8') as f:
-            f.writelines(self.header)
-            f.writelines(self.body)
+        if self.tmp_fp:
+            self.tmp_fp.flush()
+            self.tmp_fp = None
+        with open(self.save_path, mode='w', encoding='utf-8') as outfile:
+            # write header
+            outfile.writelines((
+                HEADER_HTML, self._create_styles(), '</head>\n',
+                '<body>\n', '<table class="css">\n',
+                self._parse_table_header(fmt_headers)))
+            # write body
+            fp = open(self.tmp_filename, mode="r", encoding="utf-8")
+            for line in fp:
+                outfile.write(line)
+            outfile.write('</table>\n</body>\n</html>')
+            fp.close()
+            os.remove(self.tmp_filename)
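Before this change every rendered table row accumulated in `self.body` until `finalize()`; the rewrite streams rows into a `NamedTemporaryFile` during processing and assembles header, body, and footer only at the end, keeping memory flat however long the archive is. A stripped-down sketch of the stream-then-merge pattern (class and method names are illustrative; this sketch closes the temp handle before re-reading it, which the real diff only flushes):

```python
import os
import tempfile

class StreamingWriter:
    """Sketch: buffer rows on disk, assemble the final file in finalize()."""

    def __init__(self, save_path):
        self.save_path = save_path
        self.tmp_fp = tempfile.NamedTemporaryFile(
            mode="a", encoding="utf-8", delete=False)
        self.tmp_filename = self.tmp_fp.name

    def write_row(self, line):
        # rows go to the temp file, not to an in-memory list
        self.tmp_fp.write(line)

    def finalize(self, header, footer):
        self.tmp_fp.flush()
        self.tmp_fp.close()
        with open(self.save_path, mode="w", encoding="utf-8") as outfile:
            outfile.write(header)
            with open(self.tmp_filename, mode="r", encoding="utf-8") as fp:
                for line in fp:          # copy the body line by line
                    outfile.write(line)
            outfile.write(footer)
        os.remove(self.tmp_filename)     # the buffer is no longer needed
```

After `finalize()`, only the assembled output file remains on disk.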


@@ -1,3 +1,4 @@
+from typing import Generator
 from . import asyncdl
 from . import duplcheck
 from .. videoinfo import VideoInfo
@@ -60,11 +61,10 @@ class Extractor:
         self.blocks = duplcheck.remove_duplicate_tail(self.blocks)
         return self

-    def _combine(self):
-        ret = []
+    def _get_chatdata(self) -> Generator:
         for block in self.blocks:
-            ret.extend(block.chat_data)
-        return ret
+            for chatdata in block.chat_data:
+                yield chatdata

     def _execute_extract_operations(self):
         return (
@@ -74,7 +74,7 @@ class Extractor:
             ._remove_overlap()
             ._download_blocks()
             ._remove_duplicate_tail()
-            ._combine()
+            ._get_chatdata()
         )

     def extract(self):

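`_combine()` materialized the whole chat list before returning; `_get_chatdata()` yields items one at a time, so `extract()` hands downstream processors a lazy stream instead of a full in-memory list. The two approaches side by side, with a minimal stand-in `Block` (illustrative, not the repo's class):

```python
from typing import Generator, List

class Block:
    """Minimal stand-in for an extractor block holding fetched chat data."""

    def __init__(self, chat_data):
        self.chat_data = chat_data

def combine(blocks) -> List:
    # old approach: materialize everything in memory
    ret = []
    for block in blocks:
        ret.extend(block.chat_data)
    return ret

def get_chatdata(blocks) -> Generator:
    # new approach: yield one item at a time
    for block in blocks:
        for chatdata in block.chat_data:
            yield chatdata
```

Both produce the same sequence; the generator just defers the work until the consumer iterates.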

@@ -16,10 +16,11 @@ def extract(url):
         json.dump(html.json(), f, ensure_ascii=False)


-def save(data, filename, extention):
-    with open(filename + "_" + (datetime.datetime.now().strftime('%Y-%m-%d %H-%M-%S')) + extention,
-              mode='w', encoding='utf-8') as f:
+def save(data, filename, extention) -> str:
+    save_filename = filename + "_" + (datetime.datetime.now().strftime('%Y-%m-%d %H-%M-%S')) + extention
+    with open(save_filename, mode='w', encoding='utf-8') as f:
         f.writelines(data)
+    return save_filename


 def checkpath(filepath):

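`util.save` now returns the path it wrote, which lets callers such as the CLI report where an error dump landed. The timestamped-save pattern in isolation (the `extention` spelling follows the repo):

```python
import datetime

def save(data, filename, extention) -> str:
    # timestamped filename, e.g. ERR_2020-11-18 01-24-37.dat
    save_filename = (filename + "_"
                     + datetime.datetime.now().strftime('%Y-%m-%d %H-%M-%S')
                     + extention)
    with open(save_filename, mode='w', encoding='utf-8') as f:
        f.writelines(data)
    return save_filename
```

Returning the path costs nothing when the caller ignores it, but makes messages like `Saved error data: ...` possible.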

@@ -1,4 +1,4 @@
-httpx[http2]==0.14.1
-protobuf==3.13.0
+httpx[http2]
+protobuf==3.14.0
 pytz
 urllib3


@@ -1,4 +1,2 @@
-mock
-mocker
-pytest
-pytest_httpx
+pytest-mock
+pytest-httpx


@@ -1,8 +1,17 @@
 import json
+from datetime import datetime
+
 from pytchat.parser.live import Parser
 from pytchat.processors.default.processor import DefaultProcessor

+TEST_TIMETSTAMP = 1570678496000000
+
+
+def get_local_datetime(timestamp):
+    dt = datetime.fromtimestamp(timestamp / 1000000)
+    return dt.strftime('%Y-%m-%d %H:%M:%S')
+
+
 def test_textmessage(mocker):
     '''text message'''
     processor = DefaultProcessor()
@@ -20,7 +29,7 @@ def test_textmessage(mocker):
     assert ret.id == "dummy_id"
     assert ret.message == "dummy_message"
     assert ret.timestamp == 1570678496000
-    assert ret.datetime == "2019-10-10 12:34:56"
+    assert ret.datetime == get_local_datetime(TEST_TIMETSTAMP)
     assert ret.author.name == "author_name"
     assert ret.author.channelId == "author_channel_id"
     assert ret.author.channelUrl == "http://www.youtube.com/channel/author_channel_id"
@@ -51,7 +60,7 @@ def test_textmessage_replay_member(mocker):
     assert ret.message == "dummy_message"
     assert ret.messageEx == ["dummy_message"]
     assert ret.timestamp == 1570678496000
-    assert ret.datetime == "2019-10-10 12:34:56"
+    assert ret.datetime == get_local_datetime(TEST_TIMETSTAMP)
     assert ret.elapsedTime == "1:23:45"
     assert ret.author.name == "author_name"
     assert ret.author.channelId == "author_channel_id"
@@ -83,7 +92,7 @@ def test_superchat(mocker):
     assert ret.message == "dummy_message"
     assert ret.messageEx == ["dummy_message"]
     assert ret.timestamp == 1570678496000
-    assert ret.datetime == "2019-10-10 12:34:56"
+    assert ret.datetime == get_local_datetime(TEST_TIMETSTAMP)
     assert ret.elapsedTime == ""
     assert ret.amountValue == 800
     assert ret.amountString == "¥800"
@@ -125,7 +134,7 @@ def test_supersticker(mocker):
     assert ret.message == ""
     assert ret.messageEx == []
     assert ret.timestamp == 1570678496000
-    assert ret.datetime == "2019-10-10 12:34:56"
+    assert ret.datetime == get_local_datetime(TEST_TIMETSTAMP)
     assert ret.elapsedTime == ""
     assert ret.amountValue == 200
     assert ret.amountString == "¥200"
@@ -166,7 +175,7 @@ def test_sponsor(mocker):
     assert ret.message == "新規メンバー"
     assert ret.messageEx == ["新規メンバー"]
     assert ret.timestamp == 1570678496000
-    assert ret.datetime == "2019-10-10 12:34:56"
+    assert ret.datetime == get_local_datetime(TEST_TIMETSTAMP)
     assert ret.elapsedTime == ""
     assert ret.bgColor == 0
     assert ret.author.name == "author_name"
@@ -199,7 +208,7 @@ def test_sponsor_legacy(mocker):
     assert ret.message == "新規メンバー / ようこそ、author_name"
     assert ret.messageEx == ["新規メンバー / ようこそ、author_name"]
     assert ret.timestamp == 1570678496000
-    assert ret.datetime == "2019-10-10 12:34:56"
+    assert ret.datetime == get_local_datetime(TEST_TIMETSTAMP)
     assert ret.elapsedTime == ""
     assert ret.bgColor == 0
     assert ret.author.name == "author_name"
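The hardcoded `"2019-10-10 12:34:56"` only matched on machines running in JST (UTC+9); `get_local_datetime` recomputes the expected string in the test runner's local zone, making the suite timezone-independent. Rendering the same instant in two zones shows why:

```python
from datetime import datetime, timezone, timedelta

TEST_TIMESTAMP = 1570678496000000  # microseconds, same value as in the test module

dt_utc = datetime.fromtimestamp(TEST_TIMESTAMP / 1000000, tz=timezone.utc)
dt_jst = dt_utc.astimezone(timezone(timedelta(hours=9)))  # JST = UTC+9

# only a JST machine reproduces the string the old asserts hardcoded
print(dt_utc.strftime('%Y-%m-%d %H:%M:%S'))  # 2019-10-10 03:34:56
print(dt_jst.strftime('%Y-%m-%d %H:%M:%S'))  # 2019-10-10 12:34:56
```

A naive `datetime.fromtimestamp` (as the helper uses) picks whichever zone the runner happens to be in, which is exactly what the chat objects under test do.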