Issue created Feb 20, 2019 by Andrei Glinskii (@glinskii.av), Developer

sklearn_component crashes when saving a large (over 4 GiB) model

Created by: ismaslov

  File "/usr/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/home/user/.virtualenvs/support/lib/python3.6/site-packages/deeppavlov/__main__.py", line 3, in <module>
    main()
  File "/home/user/.virtualenvs/support/lib/python3.6/site-packages/deeppavlov/deep.py", line 86, in main
    start_epoch_num=start_epoch_num)
  File "/home/user/.virtualenvs/support/lib/python3.6/site-packages/deeppavlov/core/commands/train.py", line 225, in train_evaluate_model_from_config
    model = fit_chainer(config, iterator)
  File "/home/user/.virtualenvs/support/lib/python3.6/site-packages/deeppavlov/core/commands/train.py", line 100, in fit_chainer
    component.save()
  File "/home/user/.virtualenvs/support/lib/python3.6/site-packages/deeppavlov/models/sklearn/sklearn_component.py", line 241, in save
    pickle.dump(self.model, f)
OverflowError: cannot serialize a bytes object larger than 4 GiB

I suggest that pickle.dump should use protocol version 4. Please see the link below: https://stackoverflow.com/questions/29704139/pickle-in-python3-doesnt-work-for-large-data-saving
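A minimal sketch of the proposed fix, assuming a hypothetical `save_model`/`load_model` pair standing in for `sklearn_component.save`: passing `protocol=4` to `pickle.dump` allows serializing byte streams larger than 4 GiB (protocol 4 has been available since Python 3.4):

```python
import pickle

def save_model(model, path):
    # protocol=4 supports objects whose pickled form exceeds 4 GiB,
    # avoiding "OverflowError: cannot serialize a bytes object larger than 4 GiB"
    with open(path, "wb") as f:
        pickle.dump(model, f, protocol=4)

def load_model(path):
    # pickle.load auto-detects the protocol used when dumping
    with open(path, "rb") as f:
        return pickle.load(f)
```

The default protocol in Python 3.6 is 3, which is why the crash occurs in the traceback above; loading is unaffected because `pickle.load` detects the protocol automatically.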
