error #245

Open
davinc123 opened this issue Nov 13, 2024 · 9 comments

@davinc123

500 (INTERNAL SERVER ERROR): 500 Internal Server Error: The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application.

Traceback (most recent call last):
  File "/home/ubuntu/miniconda3/envs/py310/lib/python3.10/site-packages/flask/app.py", line 2051, in wsgi_app
    response = self.full_dispatch_request()
  File "/home/ubuntu/miniconda3/envs/py310/lib/python3.10/site-packages/flask/app.py", line 1501, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/home/ubuntu/miniconda3/envs/py310/lib/python3.10/site-packages/flask/app.py", line 1499, in full_dispatch_request
    rv = self.dispatch_request()
  File "/home/ubuntu/miniconda3/envs/py310/lib/python3.10/site-packages/flask/app.py", line 1485, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
  File "/home/ubuntu/miniconda3/envs/py310/lib/python3.10/site-packages/flask/views.py", line 83, in view
    return self.dispatch_request(*args, **kwargs)
  File "/home/ubuntu/miniconda3/envs/py310/lib/python3.10/site-packages/scrapydweb/views/files/log.py", line 167, in dispatch_request
    self.update_kwargs()
  File "/home/ubuntu/miniconda3/envs/py310/lib/python3.10/site-packages/scrapydweb/views/files/log.py", line 359, in update_kwargs
    for d in self.stats['datas']:
KeyError: 'datas'

  • OS: Linux-6.8.0-1018-aws-x86_64-with-glibc2.39
  • Python: 3.10.14
  • ScrapydWeb: 1.4.0
  • LogParser: 0.8.2
  • Scrapyd servers amount: 1
  • User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/130.0.0.0 Safari/537.36
  • Request Method: GET
  • Request Args: ImmutableMultiDict([('job_finished', 'True')])
  • Form Data: ImmutableMultiDict([])
  • Files Data: ImmutableMultiDict([])
  • Request Referer: http://localhost:15000/1/jobs/
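
The traceback above shows scrapydweb iterating over self.stats['datas'], i.e. the per-job stats JSON produced by LogParser is missing its 'datas' key. A minimal sketch to confirm that on the server (the path below is an assumption, taken from the file named later in this thread; adjust it to your own logs directory):

import json

# Assumption: the per-job stats file that scrapydweb failed on (named further down in this thread).
stats_path = "logs/football_fotmob/fotmob_league/2024-11-13T12_10_31.json"

with open(stats_path, encoding="utf-8") as f:
    stats = json.load(f)  # raises an error here if the file is truncated or not valid JSON

print(sorted(stats))       # the top-level keys of the stats file
print("datas" in stats)    # False reproduces the KeyError shown above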
@my8100 (Owner) commented Nov 13, 2024

It looks like the JSON file below is broken; please try to find it and rename it.

logs/football_fotmob/fotmob_league/2024-11-13T12_10_31.json

# Enter the directory from which you run Scrapyd, then run the command below
# to find out where the Scrapy logs are stored:
# python -c "from os.path import abspath, isdir; from scrapyd.config import Config; path = abspath(Config().get('logs_dir')); print(path); print(isdir(path))"
# Check out https://scrapyd.readthedocs.io/en/stable/config.html#logs-dir for more info.
# e.g. 'C:/Users/username/logs' or '/home/username/logs'
LOCAL_SCRAPYD_LOGS_DIR = ''
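
If the broken file needs to be tracked down across the whole logs directory, here is a minimal sketch (not part of scrapydweb; LOGS_DIR below is an assumption, use the path printed by the python -c command above) that flags any per-job .json stats file that is unreadable or lacks the 'datas' key and renames it out of the way:

import json
import os

# Assumption: replace with the path printed by the `python -c` command above.
LOGS_DIR = "/home/username/logs"

for root, _dirs, files in os.walk(LOGS_DIR):
    for name in files:
        if not name.endswith(".json"):
            continue
        path = os.path.join(root, name)
        try:
            with open(path, encoding="utf-8") as f:
                broken = "datas" not in json.load(f)
        except (ValueError, OSError):
            broken = True  # unreadable or not valid JSON
        if broken:
            # Rename instead of deleting, so the file can still be inspected later.
            os.rename(path, path + ".broken")
            print("renamed:", path)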

@davinc123 (Author)

However, every spider has this kind of JSON file corruption, and my attempts to modify LOCAL_SCRAPYD_LOGS_DIR have not solved the problem. Is there any other way around this? My Scrapy version is 2.1.0.

@my8100 (Owner) commented Nov 14, 2024

Scrapy 2.1.0 was released in 2020.
Can you upgrade your Scrapy?

https://pypi.org/project/Scrapy/#history

my8100 mentioned this issue Nov 14, 2024
my8100 closed this as completed Nov 14, 2024
@my8100 (Owner) commented Nov 14, 2024

What is the version of your scrapyd?
Can you upgrade it and restart your spider?

https://pypi.org/project/scrapyd/#history

my8100 reopened this Nov 14, 2024
@davinc123 (Author)

What is the version of your scrapyd? Can you upgrade it and restart your spider?

https://pypi.org/project/scrapyd/#history

Which version of scrapyd do you use, please?

@my8100 (Owner) commented Nov 14, 2024

Try scrapyd v1.4.3 and the latest scrapydweb.

@davinc123 (Author) commented Nov 14, 2024

Try scrapyd v1.4.3 and the latest scrapydweb.
aaa

@my8100 (Owner) commented Nov 14, 2024

The Scrapyd log is not found.
Run the spider again.

@davinc123 (Author)

The Scrapyd log is not found. Run the spider again.
I fixed it. The problem was that I had set the log level to ERROR in the Scrapy settings file, which caused this series of errors. Thank you for your answer.
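
For anyone landing here later: LogParser builds the stats JSON by parsing the Scrapy log, so the INFO-level lines (the periodic "Crawled ... pages" entries and the final stats dump) must be present. A minimal settings sketch, assuming a standard Scrapy settings.py (the exact values used in this project are not shown in the thread):

# Scrapy settings.py
LOG_ENABLED = True   # keep logging on
LOG_LEVEL = "INFO"   # 'ERROR' suppresses the lines LogParser needs,
                     # leaving the generated stats JSON without a 'datas' key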
