# HG changeset patch
# User jbe
# Date 1434658568 -7200
# Node ID b2d024220782968d43ff34465cc2c5858aefa47f
# Parent  4cf337821a520add4d0f2c01be46fe5658822a9c
Renamed remaining_header_size_limit and remaining_body_size_limit to local "limit" variables

diff -r 4cf337821a52 -r b2d024220782 moonbridge_http.lua
--- a/moonbridge_http.lua	Thu Jun 18 04:09:21 2015 +0200
+++ b/moonbridge_http.lua	Thu Jun 18 22:16:08 2015 +0200
@@ -206,8 +206,9 @@
   return function(socket)
     local socket_set = {[socket] = true}  -- used for poll function
     local survive = true  -- set to false if process shall be terminated later
-    local consume  -- function that reads some input if possible
-    -- function that drains some input if possible:
+    local consume  -- can be set to function that reads some input if possible
+    -- function that may be used as "consume" function
+    -- and which drains some input if possible:
     local function drain()
       local bytes, status = socket:drain_nb(input_chunk_size)
       if not bytes or status == "eof" then
@@ -232,9 +233,6 @@
     end
     -- handle requests in a loop:
     repeat
-      -- copy limits:
-      local remaining_header_size_limit = header_size_limit
-      local remaining_body_size_limit = body_size_limit
       -- table for caching nil values:
       local headers_value_nil = {}
       -- create a new request object with metatable:
@@ -489,8 +487,9 @@
       -- coroutine for request body processing:
       local function read_body()
         if request.headers_flags["Transfer-Encoding"]["chunked"] then
+          local limit = body_size_limit
           while true do
-            local line = read(32 + remaining_body_size_limit, "\n")
+            local line = read(32 + limit, "\n")
             local zeros, lenstr = string.match(line, "^(0*)([1-9A-Fa-f]+[0-9A-Fa-f]*)\r?\n$")
             local chunkext
             if lenstr then
@@ -502,8 +501,8 @@
               request_error(true, "400 Bad Request", "Encoding error while reading chunk of request body")
             end
             local len = tonumber("0x" .. lenstr)
-            remaining_body_size_limit = remaining_body_size_limit - (#zeros + #chunkext + len)
-            if remaining_body_size_limit < 0 then
+            limit = limit - (#zeros + #chunkext + len)
+            if limit < 0 then
               request_error(true, "413 Request Entity Too Large", "Request body size limit exceeded")
             end
             if len == 0 then break end
@@ -514,10 +513,10 @@
             end
           end
           while true do
-            local line = read(2 + remaining_body_size_limit, "\n")
+            local line = read(2 + limit, "\n")
             if line == "\r\n" or line == "\n" then break end
-            remaining_body_size_limit = remaining_body_size_limit - #line
-            if remaining_body_size_limit < 0 then
+            limit = limit - #line
+            if limit < 0 then
               request_error(true, "413 Request Entity Too Large", "Request body size limit exceeded while reading trailer section of chunked request body")
             end
           end
@@ -966,13 +965,15 @@
       end
       -- coroutine for reading headers:
       local function read_headers()
+        -- initialize limit:
+        local limit = header_size_limit
         -- read and parse request line:
-        local line = read_eof(remaining_header_size_limit, "\n")
+        local line = read_eof(limit, "\n")
         if not line then
           return false, survive
         end
-        remaining_header_size_limit = remaining_header_size_limit - #line
-        if remaining_header_size_limit == 0 then
+        limit = limit - #line
+        if limit == 0 then
           return false, request_error(false, "414 Request-URI Too Long")
         end
         local target, proto
@@ -985,12 +986,12 @@
         end
         -- read and parse headers:
         while true do
-          local line = read(remaining_header_size_limit, "\n");
-          remaining_header_size_limit = remaining_header_size_limit - #line
+          local line = read(limit, "\n");
+          limit = limit - #line
           if line == "\r\n" or line == "\n" then
             break
           end
-          if remaining_header_size_limit == 0 then
+          if limit == 0 then
            return false, request_error(false, "431 Request Header Fields Too Large")
          end
          local key, value = string.match(line, "^([^ \t\r]+):[ \t]*(.-)[ \t]*\r?\n$")
@@ -1043,7 +1044,7 @@
            return request_error(false, "400 Bad Request", "Content-Length header(s) invalid")
          end
        end
-        if request_body_content_length > remaining_body_size_limit then
+        if request_body_content_length > body_size_limit then
          return request_error(false, "413 Request Entity Too Large", "Announced request body size is too big")
        end
      end
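
Note (not part of the changeset): the hunks above move the byte-budget bookkeeping into local "limit" variables inside read_body() and read_headers(), instead of the per-request remaining_*_size_limit copies that were previously made at the top of the request loop. The standalone Lua sketch below illustrates the same pattern in isolation; parse_chunked_body, its arguments, and the simplified chunk-size pattern are illustrative assumptions, not moonbridge_http.lua code. A local limit starts at the configured body size limit and is decremented as chunk-size lines and chunk payloads are consumed, and the body is rejected once the budget would go negative.

-- Minimal sketch of the "local limit" budget pattern (illustrative only;
-- not the moonbridge_http.lua implementation).
local function parse_chunked_body(data, body_size_limit)
  local limit = body_size_limit  -- local copy, as in the patched read_body()
  local pos = 1
  local parts = {}
  while true do
    -- read one chunk-size line (hexadecimal length, possibly zero-padded):
    local line_end = string.find(data, "\n", pos, true)
    if not line_end then error("unexpected end of data") end
    local line = string.sub(data, pos, line_end)
    pos = line_end + 1
    local zeros, lenstr = string.match(line, "^(0*)([0-9A-Fa-f]+)")
    if not lenstr then error("400 Bad Request: malformed chunk size") end
    local len = tonumber("0x" .. lenstr)
    -- charge the size line and the chunk payload against the budget:
    limit = limit - (#zeros + #lenstr + len)
    if limit < 0 then error("413 Request Entity Too Large") end
    if len == 0 then break end
    parts[#parts + 1] = string.sub(data, pos, pos + len - 1)
    pos = pos + len + 2  -- skip the payload and its trailing CRLF
  end
  return table.concat(parts)
end

-- usage: a two-chunk body that fits within a 64 byte budget
print(parse_chunked_body("5\r\nHello\r\n6\r\n World\r\n0\r\n\r\n", 64))
--> Hello World

Keeping the budget local to the reader function, as the patch does, removes mutable per-request state from the surrounding repeat loop.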