Python comes with a built-in json module. Just use json.load() or json.loads() to parse your JSON data. The first call reads from a file-like object, the second one from a string, but in all other ways they're identical.
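For example (a minimal sketch; the file name is made up):

import json

with open("data.json") as f:
    from_file = json.load(f)            # load() takes a file-like object

from_string = json.loads('[1, 2, 3]')   # loads() takes a str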
There are a bunch of third-party modules (ujson, etc.) which are faster, but fundamentally, they're all the same.
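Most of them copy the standard library's interface, so trying one is usually just a changed import (this sketch assumes ujson has been installed, e.g. with pip):

import ujson   # third-party; same loads()/dumps() interface as json

data = ujson.loads('[1, 2, 3]')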
If I understand you correctly, you're reading a JSON document which is so large that if you store the converted data as a Python object, you run out of memory? If that's the case, I'm not sure there's a good pure-Python solution. I don't know of any json modules which parse the data without storing it.
Depending on what operating system you're on, there may be a command-line utility which parses JSON. For example, on Ubuntu Linux, there's "json_xs" (shipped with Perl's JSON::XS module). Perhaps shell out to that, use the "-t null" output format (which parses the input and then discards it), redirect stderr to /dev/null, and see what exit status you get:
# Good JSON
$ echo '[1, 2, 3]' | json_xs -t null 2>/dev/null; echo $?
0
# Bad JSON
$ echo '[1; 2, 3]' | json_xs -t null 2>/dev/null; echo $?
255
Wrap this up in a subprocess.check_output() call (note that it raises CalledProcessError on a non-zero exit status), and you're done.
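For instance, a minimal sketch (the helper and file names are mine, and it assumes json_xs is on your PATH):

import subprocess

def is_valid_json(path):
    # "-t null" makes json_xs parse stdin and discard the result;
    # check_output() raises CalledProcessError on a non-zero exit status.
    with open(path, "rb") as f:
        try:
            subprocess.check_output(["json_xs", "-t", "null"],
                                    stdin=f, stderr=subprocess.DEVNULL)
        except subprocess.CalledProcessError:
            return False
    return True

print(is_valid_json("big.json"))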