I would love for SQLite to add JSON import. It already has great CSV import and export. If we could easily import, say, a JSON array into a single-column table, it would make converting between JSON and CSV much easier, especially since jq is slow and unmaintained.
There are a couple of threads related to JSON import, but a built-in mode would be helpful: there is already .mode json for output, so import support would be a natural complement.
I guess you are trying to do something like the following, based on your comment:
$ echo '["Kate", "John", "Jim"]' > names.json
$ ./sqlite3
SQLite version 3.38.0 2022-02-22 18:58:40
Enter ".help" for usage hints.
Connected to a transient in-memory database.
Use ".open FILENAME" to reopen on a persistent database.
sqlite> create table users(id integer primary key, name text);
sqlite> insert into users(name) select value from json_each(readfile('names.json'));
sqlite> select * from users;
1|Kate
2|John
3|Jim
It seems readfile comes from an extension but is built into the CLI. json_each is available in standard SQLite as of 3.38.0, since the json1 extension is now part of the core.
> Note that the readfile(X) and writefile(X,Y) functions are extension functions and are not built into the core SQLite library. These routines are available as a loadable extension in the ext/misc/fileio.c source file in the SQLite source code repositories.
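If you are driving SQLite from a library binding rather than the CLI, readfile() is not available, but you can read the file yourself and hand the JSON text to json_each() as a bound parameter. A minimal sketch in Python, assuming your linked SQLite build includes the JSON1 functions (part of core since 3.38):

```python
import sqlite3
from pathlib import Path

# Write a sample JSON array, standing in for names.json from the transcript.
Path("names.json").write_text('["Kate", "John", "Jim"]')

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users(id INTEGER PRIMARY KEY, name TEXT)")

# Reading the file in Python plays the role of the CLI-only readfile().
doc = Path("names.json").read_text()
con.execute("INSERT INTO users(name) SELECT value FROM json_each(?)", (doc,))

print(con.execute("SELECT * FROM users").fetchall())
# [(1, 'Kate'), (2, 'John'), (3, 'Jim')]
```

The only difference from the CLI session above is who reads the file; json_each() does the same work either way.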
It works with YAML, TOML, JSON, JSON Lines, CSV, and regular shell command output. You can import from any of these formats, convert to any other, and even do inline SQL relational lookups.
Since SQL inlining literally just imports the data into an in-memory SQLite database, you can do your JSON import into SQLite using this shell. In fact I did exactly this last month with a cloud service's REST API, which returned two different JSON documents that needed to be restructured into two tables and then joined with a relational query.
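For comparison, that same workflow is easy to sketch with plain Python and the stdlib sqlite3 module: load two JSON documents into two in-memory tables, then run a relational query between them. This is only an illustration of the idea, not the parent's shell syntax, and the document shapes here are hypothetical stand-ins for the API responses:

```python
import json
import sqlite3

# Two hypothetical JSON documents, standing in for the two API responses.
users_doc = json.loads('[{"id": 1, "name": "Kate"}, {"id": 2, "name": "John"}]')
orders_doc = json.loads(
    '[{"user_id": 1, "total": 9.5}, {"user_id": 1, "total": 3.0},'
    ' {"user_id": 2, "total": 7.25}]'
)

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users(id INTEGER PRIMARY KEY, name TEXT)")
con.execute("CREATE TABLE orders(user_id INTEGER, total REAL)")

# Restructure each document into its own table.
con.executemany("INSERT INTO users VALUES (:id, :name)", users_doc)
con.executemany("INSERT INTO orders VALUES (:user_id, :total)", orders_doc)

# Relational query across the two imported documents.
for name, spent in con.execute(
    "SELECT u.name, SUM(o.total) FROM users u "
    "JOIN orders o ON o.user_id = u.id GROUP BY u.id"
):
    print(name, spent)
# Kate 12.5
# John 7.25
```

A shell that inlines SQL is essentially automating these steps for you: parse, load into transient tables, query.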