
Setting max_body_size for the underlying tornado server isn't working #3797

@mociarain

Description


Apparently this limit is configurable beyond tornado's default (100 MB) and is already set to ~0.5 GB in the notebook's default values. By setting 'max_body_size' and 'max_buffer_size' in the tornado_settings dict I should be able to go past this (GitHub thread: #650), but the setting does not appear to take effect.
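For completeness, the same override can also be placed in jupyter_notebook_config.py (this is a sketch of the equivalent config-file form, not part of the original report; the values mirror the command-line invocation below):

# jupyter_notebook_config.py -- equivalent of the command-line override used below.
# max_body_size / max_buffer_size are tornado HTTPServer limits; whether passing
# them through tornado_settings actually reaches the HTTPServer is exactly what
# this issue questions.
c.NotebookApp.tornado_settings = {
    'max_body_size': 504857600,    # ~0.5 GB
    'max_buffer_size': 504857600,
}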

To reproduce:

My Environment:

{
	'commit_hash': '7f10f7bb3',
	'commit_source': 'installation',
	'default_encoding': 'UTF-8',
	'ipython_version': '6.4.0',
	'os_name': 'posix',
	'platform': 'Darwin-17.6.0-x86_64-i386-64bit',
	'sys_platform': 'darwin',
	'sys_version': '3.6.5 (default, Mar 30 2018, 06:41:53) \n'
	'[GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.39.2)]'
}

Running Mac OSX
Browser: Chrome Version 67.0.3396.99
matplotlib==2.2.2
numpy==1.15.0

Set up the server:

jupyter notebook --NotebookApp.tornado_settings="{'max_body_size': 504857600, 'max_buffer_size': 504857600}"

To generate a sufficiently large payload:

# Disable the autosave to stop chunking of the payload
%autosave 10000000000000

# import the modules
import numpy as np
import matplotlib.cm as cm
import matplotlib.pyplot as plt

import matplotlib as mpl
mpl.rcParams['figure.dpi'] = 800

def print_big():
    '''Running this generates an image of ~26 MB.'''
    rand_array = np.random.rand(10000, 10000)  # create a large random array
    plt.imshow(rand_array, cmap=cm.bone)       # render it with the selected colour map
    plt.show()                                 # display the image inline

Then run print_big() in four separate cells. Saving the notebook then generates a payload of ~108 MB, and the server logs the warning:
Malformed HTTP message from ::1: Content-Length too long
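As a rough, standalone way to confirm the payload size (a sketch, not part of the original report; the Agg backend and hard-coded dpi are assumptions), you can base64-encode one rendered PNG, which approximates what the notebook JSON stores per cell output:

import base64
from io import BytesIO

import matplotlib
matplotlib.use('Agg')  # render off-screen for a standalone estimate
import matplotlib.cm as cm
import matplotlib.pyplot as plt
import numpy as np

# Render one figure to an in-memory PNG and measure its base64 size, which
# approximates the per-output contribution to the save payload.
buf = BytesIO()
fig, ax = plt.subplots()
ax.imshow(np.random.rand(10000, 10000), cmap=cm.bone)
fig.savefig(buf, format='png', dpi=800)
plt.close(fig)

encoded_mb = len(base64.b64encode(buf.getvalue())) / (1024 * 1024)
print('one embedded image is roughly %.1f MB as base64' % encoded_mb)

Four such outputs comfortably exceed tornado's 100 MB default, which matches the ~108 MB payload reported above.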
