api, nesting too deep
#1
I was trying to make a script, but when reading this data I got the message 'nesting too deep' (script below).
Can I do anything to also read that part of the API?

Code:
require('json')
require('socket.http')
require("ltn12")
socket.http.TIMEOUT = 800
local data,_, code, headers, status, ret
local response = {}
        _, code, headers, status = socket.http.request{
            url='https://ergast.com/api/f1/current/last/results.json',
            sink = ltn12.sink.table(response),
       

                  }
ret = table.concat(response)
data = json.pdecode(ret)
if not data then
         log('Formule 1: cannot parse data')
return
end


log(data) -- sent data to log

Part of the results: some data I do get, but the deeper data I don't, as the message shows.
Code:
["Results"]
          * table:
           [1]
            * table:
             nesting too deep
           [2]
            * table:
             nesting too deep
           [3]
            * table:
             nesting too deep
#2
The nesting limit is needed to guard against recursive tables. Without this limit, code like this would cause an infinite loop when logging:
Code:
t = {}
t.t = t
log(t)

You can use an external JSON viewer to find which fields you need from the whole JSON table. Keep in mind that Lua indexing starts from 1 instead of 0. This will log the relevant data:
Code:
require('json')
require('socket.http')

url = 'https://ergast.com/api/f1/current/last/results.json'
data = socket.http.request(url)
data = json.pdecode(data)

if type(data) == 'table' then
  log(data.MRData.RaceTable.Races)
end
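
If you only need certain fields, you can index deeper into that table and log individual values. Here is a minimal sketch that iterates the Results array of the latest race; the field names used below (position, Driver.familyName, Constructor.name) are assumptions based on the usual Ergast response, so check them in a JSON viewer first:
Code:
require('json')
require('socket.http')

url = 'https://ergast.com/api/f1/current/last/results.json'
data = socket.http.request(url)
data = json.pdecode(data)

if type(data) == 'table' then
  -- results of the most recent race
  local results = data.MRData.RaceTable.Races[1].Results
  for _, result in ipairs(results) do
    -- position, Driver and Constructor are assumed Ergast field names
    log(result.position .. ' ' .. result.Driver.familyName .. ' (' .. result.Constructor.name .. ')')
  end
end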
#3
Thanks again, it is working.