should be it
external/duckdb/tools/CMakeLists.txt (vendored, new file, +4)
@@ -0,0 +1,4 @@
if(NOT SUN AND BUILD_SHELL)
  add_subdirectory(sqlite3_api_wrapper)
  add_subdirectory(shell)
endif()
external/duckdb/tools/juliapkg/.gitignore (vendored, new file, +2)
@@ -0,0 +1,2 @@
Manifest.toml
external/duckdb/tools/juliapkg/LICENSE (vendored, new file, +7)
@@ -0,0 +1,7 @@
Copyright 2018-2024 Stichting DuckDB Foundation

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
external/duckdb/tools/juliapkg/README.md (vendored, new file, +149)
@@ -0,0 +1,149 @@
# Official DuckDB Julia Package

DuckDB is a high-performance in-process analytical database system. It is designed to be fast, reliable, and easy to use. For more information on the goals of DuckDB, please refer to [the Why DuckDB page on our website](https://duckdb.org/why_duckdb).

The DuckDB Julia package provides a high-performance front end for DuckDB. Much like SQLite, DuckDB runs in-process within the Julia client and provides a DBInterface front end.

The package also supports multi-threaded execution, using Julia threads/tasks for this purpose. If you wish to run queries in parallel, you must launch Julia with multi-threading support (e.g., by setting the `JULIA_NUM_THREADS` environment variable).

## Installation

```julia
pkg> add DuckDB

julia> using DuckDB
```

## Basics

```julia
# create a new in-memory database
con = DBInterface.connect(DuckDB.DB, ":memory:")

# create a table
DBInterface.execute(con, "CREATE TABLE integers(i INTEGER)")

# insert data using a prepared statement
stmt = DBInterface.prepare(con, "INSERT INTO integers VALUES(?)")
DBInterface.execute(stmt, [42])

# query the database
results = DBInterface.execute(con, "SELECT 42 a")
print(results)
```
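Query results can also be collected into a DataFrame for further processing. A small sketch, assuming the DataFrames package is installed and reusing the `con` connection from the example above:

```julia
using DataFrames

# collect the query result into a DataFrame
df = DataFrame(DBInterface.execute(con, "SELECT i FROM integers"))
```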
## Scanning DataFrames

The DuckDB Julia package also provides support for querying Julia DataFrames. Note that DataFrames are read directly by DuckDB: they are not inserted into or copied into the database itself.

If you wish to load data from a DataFrame into a DuckDB table, you can run a `CREATE TABLE AS` or `INSERT INTO` query.

```julia
using DuckDB
using DataFrames

# create a new in-memory database
con = DBInterface.connect(DuckDB.DB)

# create a DataFrame
df = DataFrame(a = [1, 2, 3], b = [42, 84, 42])

# register it as a view in the database
DuckDB.register_data_frame(con, df, "my_df")

# run a SQL query over the DataFrame
results = DBInterface.execute(con, "SELECT * FROM my_df")
print(results)
```
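As noted above, registering a DataFrame only creates a view. A sketch of materializing it into a regular DuckDB table with `CREATE TABLE AS`, continuing from the example above (the table name `my_table` is an arbitrary choice):

```julia
# copy the view's contents into a persistent DuckDB table
DBInterface.execute(con, "CREATE TABLE my_table AS SELECT * FROM my_df")
```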
## Original Julia Connector

Credits to kimmolinna for the [original DuckDB Julia connector](https://github.com/kimmolinna/DuckDB.jl).

## Contributing to the Julia Package

### Formatting

The format script must be run whenever anything is changed. This can be done by running the following command from the root directory of the project:

```bash
julia tools/juliapkg/scripts/format.jl
```

### Testing

You can run the tests using the `test.sh` script:

```bash
./test.sh
```

Specific test files can be run by adding the name of the file as an argument:

```bash
./test.sh test_connection.jl
```
### Development

Build using `DISABLE_SANITIZER=1 make debug`.

To run against a locally compiled version of DuckDB, you'll need to set the `JULIA_DUCKDB_LIBRARY` environment variable, e.g.:

```bash
export JULIA_DUCKDB_LIBRARY="`pwd`/../../build/debug/src/libduckdb.dylib"
```

Note that Julia pre-compilation caching might prevent changes to this variable from taking effect. You can clear these caches using the following command:

```bash
rm -rf ~/.julia/compiled
```

For development, a few packages are required; these live in a `Project.toml` in the `test` directory and are installed like so:

```bash
cd tools/juliapkg
```

```julia
using Pkg
Pkg.activate("./test")
Pkg.instantiate()
```

#### Debugging using LLDB

Julia's built-in version management system `juliaup` can get in the way of starting a process with lldb attached, as it provides a shim for the `julia` binary.
The actual `julia` binaries live in `~/.julia/juliaup/<version>/bin/julia`.

`lldb -- julia ...` will likely not work, and you'll need to provide the absolute path of the `julia` binary, e.g.:

```bash
lldb -- ~/.julia/juliaup/julia-1.10.0+0.aarch64.apple.darwin14/bin/julia ...
```

#### Testing

To run the test suite in its entirety:

```bash
julia -e "import Pkg; Pkg.activate(\".\"); include(\"test/runtests.jl\")"
```

To run a specific test listed in `test/runtests.jl`, you can provide its name, e.g.:

```bash
julia -e "import Pkg; Pkg.activate(\".\"); include(\"test/runtests.jl\")" "test_basic_queries.jl"
```

As mentioned before, to attach lldb to this you'll have to replace the `julia` part with the absolute path.

### Automatic API generation

A base Julia wrapper around the C API is generated using the `update_api.sh` script (which internally calls the Python script `scripts/generate_c_api_julia.py`). This script uses the definitions of the DuckDB C API to automatically generate a Julia wrapper that is complete and consistent with the C API. To generate the wrapper, just run:

```bash
./update_api.sh
```

### Submitting a New Package

The DuckDB Julia package depends on the [DuckDB_jll package](https://github.com/JuliaBinaryWrappers/DuckDB_jll.jl), which can be updated by sending a PR to [Yggdrasil](https://github.com/JuliaPackaging/Yggdrasil/pull/5049).

After the `DuckDB_jll` package is updated, the DuckDB package can be updated by incrementing the version number (and dependency version numbers) in `Project.toml`, followed by [adding a comment containing the text `@JuliaRegistrator register subdir=tools/juliapkg`](https://github.com/duckdb/duckdb/commit/88b59799f41fce7cbe166e5c33d0d5f6d480278d#commitcomment-76533721) to the commit.
BIN external/duckdb/tools/juliapkg/data/album.parquet (vendored, new file, binary not shown)
BIN external/duckdb/tools/juliapkg/data/artist.parquet (vendored, new file, binary not shown)
BIN external/duckdb/tools/juliapkg/data/customer.parquet (vendored, new file, binary not shown)
BIN external/duckdb/tools/juliapkg/data/employee.parquet (vendored, new file, binary not shown)
BIN external/duckdb/tools/juliapkg/data/genre.parquet (vendored, new file, binary not shown)
BIN external/duckdb/tools/juliapkg/data/invoice.parquet (vendored, new file, binary not shown)
BIN external/duckdb/tools/juliapkg/data/invoiceline.parquet (vendored, new file, binary not shown)
BIN external/duckdb/tools/juliapkg/data/mediatype.parquet (vendored, new file, binary not shown)
BIN external/duckdb/tools/juliapkg/data/playlist.parquet (vendored, new file, binary not shown)
BIN external/duckdb/tools/juliapkg/data/playlisttrack.parquet (vendored, new file, binary not shown)
BIN external/duckdb/tools/juliapkg/data/track.parquet (vendored, new file, binary not shown)
external/duckdb/tools/juliapkg/format.sh (vendored, new executable file, +4)
@@ -0,0 +1,4 @@
set -e

cd ../..
julia tools/juliapkg/scripts/format.jl
external/duckdb/tools/juliapkg/format_check.sh (vendored, new executable file, +16)
@@ -0,0 +1,16 @@
set -e

if [[ $(git diff) ]]; then
    echo "There are already differences prior to the format! Commit your changes prior to running format_check.sh"
    exit 1
fi

./format.sh
if [[ $(git diff) ]]; then
    echo "Julia format found differences:"
    git diff
    exit 1
else
    echo "No differences found"
    exit 0
fi
external/duckdb/tools/juliapkg/release.py (vendored, new file, +91)
@@ -0,0 +1,91 @@
import subprocess
import os
import argparse
import re

parser = argparse.ArgumentParser(description='Publish a Julia release.')
parser.add_argument(
    '--yggdrassil-fork',
    dest='yggdrassil',
    action='store',
    help='Fork of the Julia Yggdrasil repository (https://github.com/JuliaPackaging/Yggdrasil)',
    default='/Users/myth/Programs/Yggdrasil',
)
args = parser.parse_args()

if not os.path.isfile(os.path.join('tools', 'juliapkg', 'release.py')):
    print('This script must be run from the root DuckDB directory (i.e. `python3 tools/juliapkg/release.py`)')
    exit(1)


def run_syscall(syscall, ignore_failure=False):
    res = os.system(syscall)
    if ignore_failure:
        return
    if res != 0:
        print(f'Failed to execute {syscall}: got exit code {str(res)}')
        exit(1)


# helper script to generate a julia release
duckdb_path = os.getcwd()

# fetch the latest tags
os.system('git fetch upstream --tags')

proc = subprocess.Popen(['git', 'show-ref', '--tags'], stdout=subprocess.PIPE)
tags = [x for x in proc.stdout.read().decode('utf8').split('\n') if len(x) > 0 and 'master-builds' not in x]


def extract_tag(x):
    keys = x.split('refs/tags/')[1].lstrip('v').split('.')
    return int(keys[0]) * 10000000 + int(keys[1]) * 10000 + int(keys[2])
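To illustrate the sort key used by `extract_tag` above: it packs the major/minor/patch components of a tag into a single integer, so tags sort numerically rather than lexically (e.g. `v0.10.2` sorts above `v0.9.9`). A standalone re-implementation for illustration (the name `extract_tag_key` and the sample hashes are made up):

```python
def extract_tag_key(ref_line):
    # a `git show-ref --tags` line looks like "<hash> refs/tags/v<major>.<minor>.<patch>"
    keys = ref_line.split('refs/tags/')[1].lstrip('v').split('.')
    # weight each component so that higher versions map to larger integers
    return int(keys[0]) * 10000000 + int(keys[1]) * 10000 + int(keys[2])

print(extract_tag_key('0f046111 refs/tags/v0.10.2'))  # 100002
print(extract_tag_key('88b59799 refs/tags/v1.1.3'))   # 10010003
```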
tags.sort(key=extract_tag)

# latest tag
splits = tags[-1].split(' ')
hash = splits[0]
tag = splits[1].replace('refs/tags/', '')
if tag[0] != 'v':
    print(f"Tag {tag} does not start with a v?")
    exit(1)

print(f'Creating a Julia release from the latest tag {tag} with commit hash {hash}')

print('> Creating a PR to the Yggdrasil repository (https://github.com/JuliaPackaging/Yggdrasil)')

os.chdir(args.yggdrassil)
run_syscall('git checkout master')
run_syscall('git pull upstream master')
run_syscall(f'git branch -D {tag}', True)
run_syscall(f'git checkout -b {tag}')
tarball_build = os.path.join('D', 'DuckDB', 'build_tarballs.jl')
with open(tarball_build, 'r') as f:
    text = f.read()

text = re.sub('\nversion = v["][0-9.]+["]\n', f'\nversion = v"{tag[1:]}"\n', text)
text = re.sub(
    'GitSource[(]["]https[:][/][/]github[.]com[/]duckdb[/]duckdb[.]git["][,] ["][a-zA-Z0-9]+["][)]',
    f'GitSource("https://github.com/duckdb/duckdb.git", "{hash}")',
    text,
)

with open(tarball_build, 'w+') as f:
    f.write(text)

run_syscall(f'git add {tarball_build}')
run_syscall(f'git commit -m "[DuckDB] Bump to {tag}"')
run_syscall(f'git push --set-upstream origin {tag}')
run_syscall(
    f'gh pr create --title "[DuckDB] Bump to {tag}" --repo "https://github.com/JuliaPackaging/Yggdrasil" --body ""'
)

print('PR has been created.\n')
print(f'Next up we need to bump the version and DuckDB_jll version to {tag} in `tools/juliapkg/Project.toml`')
print('This is not yet automated.')
print(
    '> After that PR is merged - we need to post a comment containing the text `@JuliaRegistrator register subdir=tools/juliapkg`'
)
print('> For example, see https://github.com/duckdb/duckdb/commit/0f0461113f3341135471805c9928c4d71d1f5874')
external/duckdb/tools/juliapkg/scripts/format.jl (vendored, new file, +4)
@@ -0,0 +1,4 @@
using JuliaFormatter

format("tools/juliapkg/src")
format("tools/juliapkg/test")
external/duckdb/tools/juliapkg/scripts/generate_c_api.py (vendored, new file, +143)
@@ -0,0 +1,143 @@
import os
import json
import re
import glob
import copy
from packaging.version import Version
from functools import reduce
from pathlib import Path


EXT_API_DEFINITION_PATTERN = "src/include/duckdb/main/capi/header_generation/apis/v1/*/*.json"

# The JSON files that define all available C API functions
CAPI_FUNCTION_DEFINITION_FILES = 'src/include/duckdb/main/capi/header_generation/functions/**/*.json'


# The original order of the function groups in the duckdb.h file. We maintain this for easier PR reviews.
# TODO: replace this with alphabetical ordering in a separate PR
ORIGINAL_FUNCTION_GROUP_ORDER = [
    'open_connect',
    'configuration',
    'query_execution',
    'result_functions',
    'safe_fetch_functions',
    'helpers',
    'date_time_timestamp_helpers',
    'hugeint_helpers',
    'unsigned_hugeint_helpers',
    'decimal_helpers',
    'prepared_statements',
    'bind_values_to_prepared_statements',
    'execute_prepared_statements',
    'extract_statements',
    'pending_result_interface',
    'value_interface',
    'logical_type_interface',
    'data_chunk_interface',
    'vector_interface',
    'validity_mask_functions',
    'scalar_functions',
    'aggregate_functions',
    'table_functions',
    'table_function_bind',
    'table_function_init',
    'table_function',
    'replacement_scans',
    'profiling_info',
    'appender',
    'table_description',
    'arrow_interface',
    'threading_information',
    'streaming_result_interface',
    'cast_functions',
    'expression_interface',
]


def get_extension_api_version(ext_api_definitions):
    latest_version = ""

    for version_entry in ext_api_definitions:
        if version_entry["version"].startswith("v"):
            latest_version = version_entry["version"]
        if version_entry["version"].startswith("unstable_"):
            break

    return latest_version
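The version-selection logic in `get_extension_api_version` can be exercised in isolation: given the sorted definition list (stable `v...` entries first, then `unstable_...` entries), it returns the last stable version seen before the first unstable one. A small self-contained check (the sample version strings are made up):

```python
def get_extension_api_version(ext_api_definitions):
    # returns the newest stable "v..." version preceding the first "unstable_..." entry
    latest_version = ""
    for version_entry in ext_api_definitions:
        if version_entry["version"].startswith("v"):
            latest_version = version_entry["version"]
        if version_entry["version"].startswith("unstable_"):
            break
    return latest_version

definitions = [{"version": "v0.0.1"}, {"version": "v1.2.0"}, {"version": "unstable_new_functions"}]
print(get_extension_api_version(definitions))  # v1.2.0
```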
# Parse the CAPI_FUNCTION_DEFINITION_FILES to get the full list of functions
def parse_capi_function_definitions(function_definition_file_pattern):
    # Collect all functions
    # function_files = glob.glob(CAPI_FUNCTION_DEFINITION_FILES, recursive=True)
    function_files = glob.glob(function_definition_file_pattern, recursive=True)

    function_groups = []
    function_map = {}

    # Read functions
    for file in function_files:
        with open(file, "r") as f:
            try:
                json_data = json.loads(f.read())
            except json.decoder.JSONDecodeError as err:
                print(f"Invalid JSON found in {file}: {err}")
                exit(1)

        function_groups.append(json_data)
        for function in json_data["entries"]:
            if function["name"] in function_map:
                print(f"Duplicate symbol found when parsing C API file {file}: {function['name']}")
                exit(1)

            function["group"] = json_data["group"]
            if "deprecated" in json_data:
                function["group_deprecated"] = json_data["deprecated"]

            function_map[function["name"]] = function

    # Reorder to match original order: purely intended to keep the PR review sane
    function_groups_ordered = []

    if len(function_groups) != len(ORIGINAL_FUNCTION_GROUP_ORDER):
        print(
            "The list used to match the original order of function groups in the original duckdb.h file does not match the new one. Did you add a new function group? Please also add it to ORIGINAL_FUNCTION_GROUP_ORDER for now."
        )

    for order_group in ORIGINAL_FUNCTION_GROUP_ORDER:
        curr_group = next(group for group in function_groups if group["group"] == order_group)
        function_groups.remove(curr_group)
        function_groups_ordered.append(curr_group)

    return (function_groups_ordered, function_map)


# Read the extension API definitions
def parse_ext_api_definitions(ext_api_definition):
    api_definitions = {}
    versions = []
    dev_versions = []
    for file in list(glob.glob(ext_api_definition)):
        with open(file, "r") as f:
            try:
                obj = json.loads(f.read())
                api_definitions[obj["version"]] = obj
                if obj["version"].startswith("unstable_"):
                    dev_versions.append(obj["version"])
                else:
                    if Path(file).stem != obj["version"]:
                        print(
                            f"\nMismatch between filename and version in {file}. Note that unstable versions should have a version starting with 'unstable_' and that stable versions should have the version as their filename."
                        )
                        exit(1)
                    versions.append(obj["version"])

            except json.decoder.JSONDecodeError as err:
                print(f"\nInvalid JSON found in {file}: {err}")
                exit(1)

    versions.sort(key=Version)
    dev_versions.sort()

    return [api_definitions[x] for x in (versions + dev_versions)]
external/duckdb/tools/juliapkg/scripts/generate_c_api_julia.py (vendored, new file, +918)
@@ -0,0 +1,918 @@
import argparse
import logging
import os
import pathlib
import re
from types import NoneType
from typing import Dict, List, NotRequired, TypedDict, Union

from generate_c_api import (
    EXT_API_DEFINITION_PATTERN,
    get_extension_api_version,
    parse_capi_function_definitions,
    parse_ext_api_definitions,
)


class FunctionDefParam(TypedDict):
    type: str
    name: str


class FunctionDefComment(TypedDict):
    description: str
    param_comments: dict[str, str]
    return_value: str


class FunctionDef(TypedDict):
    name: str
    group: str
    deprecated: bool
    group_deprecated: bool
    return_type: str
    params: list[FunctionDefParam]
    comment: FunctionDefComment


class FunctionGroup(TypedDict):
    group: str
    deprecated: bool
    entries: list[FunctionDef]


class DuckDBApiInfo(TypedDict):
    version: str
    commit: NotRequired[str]


def parse_c_type(type_str: str, type: list[str] = []):
    """Parses simple C types (no function pointer or array types) and returns a list of the type components.

    Args:
        type_str: A C type string to parse, e.g.: "const char* const"
        type: List to track components, used for recursion. Defaults to [].

    Returns:
        list: A list of the type components, e.g.: "const char* const" -> ["Const Ptr", "const char"]
    """
    type_str = type_str.strip()
    ptr_pattern = r"^(.*)\*(\s*const\s*)?$"

    if (m1 := re.match(ptr_pattern, type_str)) is not None:
        before_ptr = m1.group(1)
        is_const = bool(m1.group(2))
        type.append("Const Ptr" if is_const else "Ptr")
        return parse_c_type(before_ptr, type)

    type.append(type_str)
    return type
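The docstring's example for `parse_c_type` can be checked with a standalone sketch of the same parser. Note the original uses a mutable default argument (`type: list[str] = []`), which its callers work around by passing a fresh `[]` explicitly; this sketch sidesteps the pitfall with a `None` default instead:

```python
import re

def parse_c_type(type_str, type_parts=None):
    # create a fresh list per call to avoid Python's mutable-default-argument pitfall
    if type_parts is None:
        type_parts = []
    type_str = type_str.strip()
    # match a trailing "*", optionally followed by "const"
    ptr_pattern = r"^(.*)\*(\s*const\s*)?$"
    if (m1 := re.match(ptr_pattern, type_str)) is not None:
        type_parts.append("Const Ptr" if bool(m1.group(2)) else "Ptr")
        return parse_c_type(m1.group(1), type_parts)
    type_parts.append(type_str)
    return type_parts

print(parse_c_type("const char* const"))  # ['Const Ptr', 'const char']
print(parse_c_type("duckdb_result*"))     # ['Ptr', 'duckdb_result']
```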
JULIA_RESERVED_KEYWORDS = {
    "function",
    "if",
    "else",
    "while",
    "for",
    "try",
    "catch",
    "finally",
    "return",
    "break",
    "continue",
    "end",
    "begin",
    "quote",
    "let",
    "local",
    "global",
    "const",
    "do",
    "struct",
    "mutable",
    "abstract",
    "type",
    "module",
    "using",
    "import",
    "export",
    "public",
}

JULIA_BASE_TYPE_MAP = {
    # Julia standard types
    "char": "Char",
    "int": "Int",
    "int8_t": "Int8",
    "int16_t": "Int16",
    "int32_t": "Int32",
    "int64_t": "Int64",
    "uint8_t": "UInt8",
    "uint16_t": "UInt16",
    "uint32_t": "UInt32",
    "uint64_t": "UInt64",
    "double": "Float64",
    "float": "Float32",
    "bool": "Bool",
    "void": "Cvoid",
    "size_t": "Csize_t",
    # DuckDB-specific types
    "idx_t": "idx_t",
    "duckdb_type": "DUCKDB_TYPE",
    "duckdb_string_t": "duckdb_string_t",  # inline-prefixed pointer string type
    "duckdb_string": "duckdb_string",  # pointer + size type
    "duckdb_table_function": "duckdb_table_function",  # actually a struct pointer
    "duckdb_table_function_t": "duckdb_table_function_ptr",  # function pointer type
    "duckdb_cast_function": "duckdb_cast_function",  # actually a struct pointer
    "duckdb_cast_function_t": "duckdb_cast_function_ptr",  # function pointer type
}
# TODO: this is the original order of the functions in `api.jl` and is only used to keep the PR review small
JULIA_API_ORIGINAL_ORDER = [
    "duckdb_open",
    "duckdb_open_ext",
    "duckdb_close",
    "duckdb_connect",
    "duckdb_disconnect",
    "duckdb_create_config",
    "duckdb_config_count",
    "duckdb_get_config_flag",
    "duckdb_set_config",
    "duckdb_destroy_config",
    "duckdb_query",
    "duckdb_destroy_result",
    "duckdb_column_name",
    "duckdb_column_type",
    "duckdb_column_logical_type",
    "duckdb_column_count",
    "duckdb_row_count",
    "duckdb_rows_changed",
    "duckdb_column_data",
    "duckdb_nullmask_data",
    "duckdb_result_error",
    "duckdb_result_get_chunk",
    "duckdb_result_is_streaming",
    "duckdb_stream_fetch_chunk",
    "duckdb_result_chunk_count",
    "duckdb_value_boolean",
    "duckdb_value_int8",
    "duckdb_value_int16",
    "duckdb_value_int32",
    "duckdb_value_int64",
    "duckdb_value_hugeint",
    "duckdb_value_uhugeint",
    "duckdb_value_uint8",
    "duckdb_value_uint16",
    "duckdb_value_uint32",
    "duckdb_value_uint64",
    "duckdb_value_float",
    "duckdb_value_double",
    "duckdb_value_date",
    "duckdb_value_time",
    "duckdb_value_timestamp",
    "duckdb_value_interval",
    "duckdb_value_varchar",
    "duckdb_value_varchar_internal",
    "duckdb_value_is_null",
    "duckdb_malloc",
    "duckdb_free",
    "duckdb_vector_size",
    "duckdb_from_time_tz",
    "duckdb_prepare",
    "duckdb_destroy_prepare",
    "duckdb_prepare_error",
    "duckdb_nparams",
    "duckdb_param_type",
    "duckdb_bind_boolean",
    "duckdb_bind_int8",
    "duckdb_bind_int16",
    "duckdb_bind_int32",
    "duckdb_bind_int64",
    "duckdb_bind_hugeint",
    "duckdb_bind_uhugeint",
    "duckdb_bind_uint8",
    "duckdb_bind_uint16",
    "duckdb_bind_uint32",
    "duckdb_bind_uint64",
    "duckdb_bind_float",
    "duckdb_bind_double",
    "duckdb_bind_date",
    "duckdb_bind_time",
    "duckdb_bind_timestamp",
    "duckdb_bind_interval",
    "duckdb_bind_varchar",
    "duckdb_bind_varchar_length",
    "duckdb_bind_blob",
    "duckdb_bind_null",
    "duckdb_execute_prepared",
    "duckdb_pending_prepared",
    "duckdb_pending_prepared_streaming",
    "duckdb_pending_execute_check_state",
    "duckdb_destroy_pending",
    "duckdb_pending_error",
    "duckdb_pending_execute_task",
    "duckdb_execute_pending",
    "duckdb_pending_execution_is_finished",
    "duckdb_destroy_value",
    "duckdb_create_varchar",
    "duckdb_create_varchar_length",
    "duckdb_create_int64",
    "duckdb_get_varchar",
    "duckdb_get_int64",
    "duckdb_create_logical_type",
    "duckdb_create_decimal_type",
    "duckdb_get_type_id",
    "duckdb_decimal_width",
    "duckdb_decimal_scale",
    "duckdb_decimal_internal_type",
    "duckdb_enum_internal_type",
    "duckdb_enum_dictionary_size",
    "duckdb_enum_dictionary_value",
    "duckdb_list_type_child_type",
    "duckdb_struct_type_child_count",
    "duckdb_union_type_member_count",
    "duckdb_struct_type_child_name",
    "duckdb_union_type_member_name",
    "duckdb_struct_type_child_type",
    "duckdb_union_type_member_type",
    "duckdb_destroy_logical_type",
    "duckdb_create_data_chunk",
    "duckdb_destroy_data_chunk",
    "duckdb_data_chunk_reset",
    "duckdb_data_chunk_get_column_count",
    "duckdb_data_chunk_get_size",
    "duckdb_data_chunk_set_size",
    "duckdb_data_chunk_get_vector",
    "duckdb_vector_get_column_type",
    "duckdb_vector_get_data",
    "duckdb_vector_get_validity",
    "duckdb_vector_ensure_validity_writable",
    "duckdb_list_vector_get_child",
    "duckdb_list_vector_get_size",
    "duckdb_struct_vector_get_child",
    "duckdb_union_vector_get_member",
    "duckdb_vector_assign_string_element",
    "duckdb_vector_assign_string_element_len",
    "duckdb_create_table_function",
    "duckdb_destroy_table_function",
    "duckdb_table_function_set_name",
    "duckdb_table_function_add_parameter",
    "duckdb_table_function_set_extra_info",
    "duckdb_table_function_set_bind",
    "duckdb_table_function_set_init",
    "duckdb_table_function_set_local_init",
    "duckdb_table_function_set_function",
    "duckdb_table_function_supports_projection_pushdown",
    "duckdb_register_table_function",
    "duckdb_bind_get_extra_info",
    "duckdb_bind_add_result_column",
    "duckdb_bind_get_parameter_count",
    "duckdb_bind_get_parameter",
    "duckdb_bind_set_bind_data",
    "duckdb_bind_set_cardinality",
    "duckdb_bind_set_error",
    "duckdb_init_get_extra_info",
    "duckdb_init_get_bind_data",
    "duckdb_init_set_init_data",
    "duckdb_init_get_column_count",
    "duckdb_init_get_column_index",
    "duckdb_init_set_max_threads",
    "duckdb_init_set_error",
    "duckdb_function_get_extra_info",
    "duckdb_function_get_bind_data",
    "duckdb_function_get_init_data",
    "duckdb_function_get_local_init_data",
    "duckdb_function_set_error",
    "duckdb_add_replacement_scan",
    "duckdb_replacement_scan_set_function_name",
    "duckdb_replacement_scan_add_parameter",
    "duckdb_replacement_scan_set_error",
    "duckdb_appender_create",
    "duckdb_appender_error",
    "duckdb_appender_flush",
    "duckdb_appender_close",
    "duckdb_appender_destroy",
    "duckdb_appender_begin_row",
    "duckdb_appender_end_row",
    "duckdb_append_bool",
    "duckdb_append_int8",
    "duckdb_append_int16",
    "duckdb_append_int32",
    "duckdb_append_int64",
    "duckdb_append_hugeint",
    "duckdb_append_uhugeint",
    "duckdb_append_uint8",
    "duckdb_append_uint16",
    "duckdb_append_uint32",
    "duckdb_append_uint64",
    "duckdb_append_float",
    "duckdb_append_double",
    "duckdb_append_date",
    "duckdb_append_time",
    "duckdb_append_timestamp",
    "duckdb_append_interval",
    "duckdb_append_varchar",
    "duckdb_append_varchar_length",
    "duckdb_append_blob",
    "duckdb_append_null",
    "duckdb_execute_tasks",
    "duckdb_create_task_state",
    "duckdb_execute_tasks_state",
    "duckdb_execute_n_tasks_state",
    "duckdb_finish_execution",
    "duckdb_task_state_is_finished",
    "duckdb_destroy_task_state",
    "duckdb_execution_is_finished",
    "duckdb_create_scalar_function",
    "duckdb_destroy_scalar_function",
    "duckdb_scalar_function_set_name",
    "duckdb_scalar_function_add_parameter",
    "duckdb_scalar_function_set_return_type",
    "duckdb_scalar_function_set_function",
    "duckdb_register_scalar_function",
]
class JuliaApiTarget:
    indent: int = 0
    linesep: str = os.linesep
    type_maps: dict[str, str] = {}  # C to Julia
    inverse_type_maps: dict[str, list[str]] = {}  # Julia to C
    deprecated_functions: list[str] = []
    type_map: dict[str, str]

    # Functions to skip
    skipped_functions = set()
    skip_deprecated_functions = False

    # Explicit function order
    manual_order: Union[List[str], None] = None

    overwrite_function_signatures = {}

    # Functions that use indices either as ARG or RETURN and should be converted to 1-based indexing
    auto_1base_index: bool
    auto_1base_index_return_functions = set()
    auto_1base_index_ignore_functions = set()

    def __init__(
        self,
        file,
        indent=0,
        auto_1base_index=True,
        auto_1base_index_return_functions=set(),
        auto_1base_index_ignore_functions=set(),
        skipped_functions=set(),
        skip_deprecated_functions=False,
        type_map={},
        overwrite_function_signatures={},
    ):
        # check if file is a string or a path object
        if isinstance(file, (str, pathlib.Path)):
            self.filename = pathlib.Path(file)
        else:
            raise ValueError("file must be a string or a path object")
        self.indent = indent
        self.auto_1base_index = auto_1base_index
        self.auto_1base_index_return_functions = auto_1base_index_return_functions
        self.auto_1base_index_ignore_functions = auto_1base_index_ignore_functions
        self.linesep = os.linesep
        self.type_map = type_map
        self.skipped_functions = skipped_functions
        self.skip_deprecated_functions = skip_deprecated_functions
        self.overwrite_function_signatures = overwrite_function_signatures
        super().__init__()

    def __enter__(self):
        self.file = open(self.filename, "w")
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        self.file.close()

    def write_empty_line(self, n=1) -> None:
        """Writes an empty line to the output file."""
        for i in range(n):
            self.file.write(self.linesep)

    def _get_casted_type(self, type_str: str, is_return_arg=False, auto_remove_t_suffix=True):
        type_str = type_str.strip()
        type_definition = parse_c_type(type_str, [])

        def reduce_type(type_list: list[str]):
            if len(type_list) == 0:
                return ""

            t = type_list[0]
            if len(type_list) == 1:
                is_const = False  # Track that the type is const, even though we cannot use it in Julia
                if t.startswith("const "):
                    t, is_const = t.removeprefix("const "), True

                if t in self.type_map:
                    return self.type_map[t]
                else:
                    if auto_remove_t_suffix and t.endswith("_t"):
                        t = t.removesuffix("_t")
                    if " " in t:
                        raise ValueError(f"Unknown type: {t}")
                    return t

            # Handle Pointer types
            if t not in ("Ptr", "Const Ptr"):
                raise ValueError(f"Unexpected non-pointer type: {t}")

            if len(type_list) >= 2 and type_list[1].strip() in (
                "char",
                "const char",
            ):
                return "Cstring"
            else:
                if is_return_arg:
                    # Use Ptr for return types, because they are not tracked by the Julia GC
                    return "Ptr{" + reduce_type(type_list[1:]) + "}"
                else:
                    # Prefer Ref over Ptr for arguments
                    return "Ref{" + reduce_type(type_list[1:]) + "}"

        return reduce_type(type_definition)

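The recursion above maps a parsed C type list outside-in: `char*` collapses to `Cstring`, while other pointers become `Ref{...}` in argument position and `Ptr{...}` in return position. A minimal standalone sketch of that rule (the `reduce` helper and toy `TYPE_MAP` below are illustrative assumptions, not the script's actual parser):

```python
# Minimal sketch of the pointer-reduction rule used by _get_casted_type.
# The type list is outermost-first, e.g. ["Ptr", "idx_t"] for `idx_t *`.
TYPE_MAP = {"idx_t": "UInt64", "char": "Cchar"}  # toy C-to-Julia map (assumption)

def reduce(type_list, is_return_arg=False):
    t = type_list[0]
    if len(type_list) == 1:
        return TYPE_MAP.get(t, t.removesuffix("_t"))
    if type_list[1] in ("char", "const char"):
        return "Cstring"  # char* maps to Cstring
    wrapper = "Ptr" if is_return_arg else "Ref"  # GC-tracked Ref only for arguments
    return wrapper + "{" + reduce(type_list[1:], is_return_arg) + "}"

print(reduce(["Ptr", "idx_t"]))                      # → Ref{UInt64}
print(reduce(["Ptr", "idx_t"], is_return_arg=True))  # → Ptr{UInt64}
print(reduce(["Ptr", "char"]))                       # → Cstring
```

The Ref-vs-Ptr distinction matters because Julia's `Ref` keeps the referenced object rooted for the duration of the `ccall`, which is only needed for values passed in.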
    def _is_index_argument(self, name: str, function_obj: FunctionDef):
        # Check if the argument is (likely) an index
        if name not in (
            "index",
            "idx",
            "i",
            "row",
            "col",
            "column",
            "col_idx",
            "column_idx",
            "column_index",
            "row_idx",
            "row_index",
            "chunk_index",
            # "param_idx",  # TODO creates errors in bind_param
        ):
            return False

        x = None
        for param in function_obj["params"]:
            if param["name"] == name:
                x = param
                break
        if x is None:
            return False

        arg_type = self._get_casted_type(x["type"])
        if arg_type not in (
            "Int",
            "Int64",
            "UInt",
            "UInt64",
            "idx_t",
            "idx",
            "Int32",
            "UInt32",
            "Csize_t",
        ):
            return False

        return True

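The method above is a two-part heuristic: an argument is only treated as an index when BOTH its name and its mapped Julia type look index-like. A condensed sketch (the sets below are abbreviated from the lists in the source):

```python
# Sketch of the two-part heuristic in _is_index_argument: an argument is
# treated as an index only if its name AND its mapped type both look index-like.
INDEX_NAMES = {"index", "idx", "i", "row", "col", "column", "col_idx", "row_idx"}
INDEX_TYPES = {"Int", "Int64", "UInt", "UInt64", "idx", "Int32", "UInt32", "Csize_t"}

def looks_like_index(name: str, julia_type: str) -> bool:
    return name in INDEX_NAMES and julia_type in INDEX_TYPES

print(looks_like_index("col_idx", "UInt64"))  # → True: name and type both match
print(looks_like_index("name", "UInt64"))     # → False: name is not index-like
print(looks_like_index("index", "Cstring"))   # → False: type is not integral
```

Requiring both signals avoids accidentally rebasing an integral argument that merely happens to be named `i`, or a string argument named `index`.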
    def get_argument_names_and_types(self, function_obj: FunctionDef):
        def _get_arg_name(name: str):
            if name in JULIA_RESERVED_KEYWORDS:
                return f"_{name}"
            return name

        arg_names = [_get_arg_name(param["name"]) for param in function_obj["params"]]

        if function_obj["name"] in self.overwrite_function_signatures:
            return_type, arg_types = self.overwrite_function_signatures[function_obj["name"]]
            return arg_names, arg_types

        arg_types = [self._get_casted_type(param["type"]) for param in function_obj["params"]]
        return arg_names, arg_types

    def is_index1_function(self, function_obj: FunctionDef):
        fname = function_obj["name"]

        if not self.auto_1base_index:
            return [False for param in function_obj["params"]], False

        if fname in self.auto_1base_index_ignore_functions:
            return [False for param in function_obj["params"]], False

        is_index1_return = fname in self.auto_1base_index_return_functions
        is_index1_arg = [self._is_index_argument(param["name"], function_obj) for param in function_obj["params"]]
        return is_index1_arg, is_index1_return

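The index-rebasing convention these methods drive is simple arithmetic: index-like arguments get `- 1` at the `ccall` site (Julia callers pass 1-based indices, C expects 0-based), and functions listed in `auto_1base_index_return_functions` get `+ 1` applied to the returned value. A sketch:

```python
# Sketch of the 1-based index convention applied by the generator.
def to_c_index(julia_index: int) -> int:
    return julia_index - 1  # Julia callers pass 1-based indices; C wants 0-based

def to_julia_index(c_index: int) -> int:
    return c_index + 1  # C returns 0-based indices; Julia callers see 1-based

print(to_c_index(1))                     # → 0: Julia's first element is C's element 0
print(to_julia_index(to_c_index(5)))     # → 5: the round trip is the identity
```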
    def _write_function_docstring(self, function_obj: FunctionDef):
        r"""Writes the Julia docstring for a function.

        Example:
        ```julia
        \"\"\"
        	duckdb_get_int64(value)

        Obtains an int64 of the given value.

        # Arguments
        - `value`: The value

        Returns: The int64 value, or 0 if no conversion is possible
        \"\"\"
        ```

        Args:
            function_obj: The parsed C API function definition.
        """

        description = function_obj.get("comment", {}).get("description", "").strip()
        description = description.replace('"', '\\"')  # escape double quotes

        index1_args, index1_return = self.is_index1_function(function_obj)

        # Arguments
        arg_names, arg_types = self.get_argument_names_and_types(function_obj)

        arg_comments = []
        for ix, (name, param, t, is_index1) in enumerate(
            zip(arg_names, function_obj["params"], arg_types, index1_args)
        ):
            param_comment = function_obj.get("comment", {}).get("param_comments", {}).get(param["name"], "")
            if is_index1:
                parts = [f"`{name}`:", f"`{t}`", "(1-based index)", param_comment]
            else:
                parts = [f"`{name}`:", f"`{t}`", param_comment]
            arg_comments.append(" ".join(parts))

        arg_names_s = ", ".join(arg_names)

        # Return Values
        return_type = self._get_casted_type(function_obj["return_type"], is_return_arg=True)
        if return_type == "Cvoid":
            return_type = "Nothing"  # Cvoid is equivalent to Nothing in Julia
        return_comments = [
            f"`{return_type}`",
            function_obj.get("comment", {}).get("return_value", ""),
        ]
        if index1_return:
            return_comments.append("(1-based index)")
        return_value_comment = " ".join(return_comments)

        self.file.write(f"{' ' * self.indent}\"\"\"\n")
        self.file.write(f"{' ' * self.indent} {function_obj['name']}({arg_names_s})\n")
        self.file.write(f"{' ' * self.indent}\n")
        self.file.write(f"{' ' * self.indent}{description}\n")
        self.file.write(f"{' ' * self.indent}\n")
        self.file.write(f"{' ' * self.indent}# Arguments\n")
        for i, arg_name in enumerate(arg_names):
            self.file.write(f"{' ' * self.indent}- {arg_comments[i]}\n")
        self.file.write(f"{' ' * self.indent}\n")
        self.file.write(f"{' ' * self.indent}Returns: {return_value_comment}\n")
        self.file.write(f"{' ' * self.indent}\"\"\"\n")

    def _get_depwarning_message(self, function_obj: FunctionDef):
        description = function_obj.get("comment", {}).get("description", "")
        if not description.startswith("**DEPRECATION NOTICE**:"):
            description = f"**DEPRECATION NOTICE**: {description}"

        # Only use the first line of the description
        notice = description.split("\n")[0]
        notice = notice.replace("\n", " ").replace('"', '\\"').strip()
        return notice

    def _write_function_depwarn(self, function_obj: FunctionDef, indent: int = 0):
        """
        Writes a deprecation warning for a function.

        Example:
        ```julia
        Base.depwarn(
            "The `G` type parameter will be deprecated in a future release. " *
            "Please use `MyType(args...)` instead of `MyType{$G}(args...)`.",
            :MyType,
        )
        ```
        """
        indent = self.indent + indent  # total indent

        notice = self._get_depwarning_message(function_obj)

        self.file.write(f"{' ' * indent}Base.depwarn(\n")
        self.file.write(f"{' ' * indent} \"{notice}\",\n")
        self.file.write(f"{' ' * indent} :{function_obj['name']},\n")
        self.file.write(f"{' ' * indent})\n")

    def _list_to_julia_tuple(self, lst):
        if len(lst) == 0:
            return "()"
        elif len(lst) == 1:
            return f"({lst[0]},)"
        else:
            return f"({', '.join(lst)})"

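The one-element special case in `_list_to_julia_tuple` exists because Julia (like Python) needs a trailing comma to distinguish a one-element tuple from a parenthesized expression: `(duckdb_connection,)` is a tuple type for `ccall`, while `(duckdb_connection)` is just the bare type. A standalone copy of the helper:

```python
# Standalone copy of _list_to_julia_tuple: the trailing comma in the
# one-element case is what makes the output a Julia tuple, not a grouping.
def list_to_julia_tuple(lst):
    if len(lst) == 0:
        return "()"
    if len(lst) == 1:
        return f"({lst[0]},)"
    return f"({', '.join(lst)})"

print(list_to_julia_tuple([]))                          # → ()
print(list_to_julia_tuple(["duckdb_connection"]))       # → (duckdb_connection,)
print(list_to_julia_tuple(["duckdb_vector", "idx_t"]))  # → (duckdb_vector, idx_t)
```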
    def _write_function_definition(self, function_obj: FunctionDef):
        fname = function_obj["name"]
        index1_args, index1_return = self.is_index1_function(function_obj)

        arg_names, arg_types = self.get_argument_names_and_types(function_obj)
        arg_types_tuple = self._list_to_julia_tuple(arg_types)
        arg_names_definition = ", ".join(arg_names)

        arg_names_call = []
        for arg_name, is_index1 in zip(arg_names, index1_args):
            if is_index1:
                arg_names_call.append(f"{arg_name} - 1")
            else:
                arg_names_call.append(arg_name)
        arg_names_call = ", ".join(arg_names_call)

        return_type = self._get_casted_type(function_obj["return_type"], is_return_arg=True)

        self.file.write(f"{' ' * self.indent}function {fname}({arg_names_definition})\n")

        if function_obj.get("group_deprecated", False) or function_obj.get("deprecated", False):
            self._write_function_depwarn(function_obj, indent=1)

        self.file.write(
            f"{' ' * self.indent} return ccall((:{fname}, libduckdb), {return_type}, {arg_types_tuple}, {arg_names_call}){' + 1' if index1_return else ''}\n"
        )
        self.file.write(f"{' ' * self.indent}end\n")

    def write_function(self, function_obj: FunctionDef):
        if function_obj["name"] in self.skipped_functions:
            return

        if function_obj.get("group_deprecated", False) or function_obj.get("deprecated", False):
            self.deprecated_functions.append(function_obj["name"])

        self._write_function_docstring(function_obj)
        self._write_function_definition(function_obj)

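To see the shape of text `_write_function_definition` emits, here is a sketch assembling the wrapper for a hypothetical one-argument function with an index-like argument (the function name and types below are illustrative assumptions, not taken from the definitions files):

```python
# Sketch of the Julia wrapper text emitted by _write_function_definition for a
# hypothetical function with an index argument and no deprecation warning.
fname = "duckdb_example_column_name"   # hypothetical function name (assumption)
return_type = "Cstring"
arg_types_tuple = "(duckdb_result, UInt64)"
wrapper = (
    f"function {fname}(result, col)\n"
    f" return ccall((:{fname}, libduckdb), {return_type}, {arg_types_tuple}, result, col - 1)\n"
    f"end\n"
)
print(wrapper)
```

Note the `col - 1` at the call site: the Julia-facing signature takes a 1-based index and the generator inserts the rebase directly into the `ccall` argument list.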
    def write_footer(self):
        self.write_empty_line(n=1)
        s = """
# !!!!!!!!!!!!
# WARNING: this file is autogenerated by scripts/generate_c_api_julia.py, manual changes will be overwritten
# !!!!!!!!!!!!
"""
        self.file.write(s)
        self.write_empty_line()

    def write_header(self, version=""):
        s = """
###############################################################################
#
# DuckDB Julia API
#
# !!!!!!!!!!!!
# WARNING: this file is autogenerated by scripts/generate_c_api_julia.py, manual changes will be overwritten
# !!!!!!!!!!!!
#
###############################################################################

using Base.Libc

if "JULIA_DUCKDB_LIBRARY" in keys(ENV)
    libduckdb = ENV["JULIA_DUCKDB_LIBRARY"]
else
    using DuckDB_jll
end
"""
        if version.startswith("v"):
            # remove the v prefix and use a Julia version string
            version = version[1:]

        self.file.write(s)
        self.file.write("\n")
        self.file.write(f'DUCKDB_API_VERSION = v"{version}"\n')
        self.file.write("\n")

    def write_functions(
        self,
        version,
        function_groups: List[FunctionGroup],
        function_map: Dict[str, FunctionDef],
    ):
        self._analyze_types(function_groups)  # Create the julia type map
        self.write_header(version)
        self.write_empty_line()
        if self.manual_order is not None:
            current_group = None
            for f in self.manual_order:
                if f not in function_map:
                    print(f"WARNING: Function {f} not found in function_map")
                    continue

                if current_group != function_map[f]["group"]:
                    current_group = function_map[f]["group"]
                    self.write_group_start(current_group)
                    self.write_empty_line()

                self.write_function(function_map[f])
                self.write_empty_line()

            # Write new functions
            self.write_empty_line(n=1)
            self.write_group_start("New Functions")
            self.write_empty_line(n=2)
            current_group = None
            for group in function_groups:
                for fn in group["entries"]:
                    if fn["name"] in self.manual_order:
                        continue
                    if current_group != group["group"]:
                        current_group = group["group"]
                        self.write_group_start(current_group)
                        self.write_empty_line()

                    self.write_function(fn)
                    self.write_empty_line()

        else:
            for group in function_groups:
                self.write_group_start(group["group"])
                self.write_empty_line()
                for fn in group["entries"]:
                    self.write_function(fn)
                    self.write_empty_line()
                self.write_empty_line()
            self.write_empty_line()

        self.write_footer()

    def _analyze_types(self, groups: List[FunctionGroup]):
        for group in groups:
            for fn in group["entries"]:
                for param in fn["params"]:
                    if param["type"] not in self.type_maps:
                        self.type_maps[param["type"]] = self._get_casted_type(param["type"])
                if fn["return_type"] not in self.type_maps:
                    self.type_maps[fn["return_type"]] = self._get_casted_type(fn["return_type"])

        for k, v in self.type_maps.items():
            if v not in self.inverse_type_maps:
                self.inverse_type_maps[v] = []
            self.inverse_type_maps[v].append(k)
        return

    def write_group_start(self, group):
        group = group.replace("_", " ").strip()
        # make group title uppercase
        group = " ".join([x.capitalize() for x in group.split(" ")])
        self.file.write(f"# {'-' * 80}\n")
        self.file.write(f"# {group}\n")
        self.file.write(f"# {'-' * 80}\n")

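The C-to-Julia map built by `_analyze_types` is many-to-one (several C spellings can land on the same Julia type), so the inverse map stores a list of C types per Julia type. A sketch of that construction with a toy map:

```python
# Sketch of the inverse-map construction in _analyze_types: the C-to-Julia
# mapping is many-to-one, so the inverse maps each Julia type to a list.
type_maps = {"idx_t": "UInt64", "uint64_t": "UInt64", "const char *": "Cstring"}  # toy data

inverse_type_maps = {}
for c_type, julia_type in type_maps.items():
    inverse_type_maps.setdefault(julia_type, []).append(c_type)

print(inverse_type_maps["UInt64"])   # → ['idx_t', 'uint64_t']
print(inverse_type_maps["Cstring"])  # → ['const char *']
```

This is the structure `main()` later prints under `--print-type-mapping` to let a reviewer spot surprising collisions.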
    @staticmethod
    def get_function_order(filepath):
        path = pathlib.Path(filepath)
        if not path.exists() or not path.is_file():
            raise FileNotFoundError(f"File {path} does not exist")

        with open(path, "r") as f:
            lines = f.readlines()

        is_julia_file = path.suffix == ".jl"

        if not is_julia_file:
            # read the file and assume that we have a function name per line
            return [x.strip() for x in lines if x.strip() != ""]

        # find the function definitions
        # TODO this is a very simple regex that only supports the long function form `function name(...)`
        function_regex = r"^function\s+([a-zA-Z_][a-zA-Z0-9_]*)\s*\("
        function_order = []
        for line in lines:
            line = line.strip()
            if line.startswith("#"):
                continue

            m = re.match(function_regex, line)
            if m is not None:
                function_order.append(m.group(1))
        return function_order

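As the TODO notes, the regex only recognizes the long `function name(...)` form, so short-form definitions like `close(db) = _close(db)` are never picked up. A quick demonstration of what the pattern matches:

```python
import re

# Demonstration of the long-form function regex from get_function_order.
function_regex = r"^function\s+([a-zA-Z_][a-zA-Z0-9_]*)\s*\("
lines = [
    "function duckdb_open(path)",    # long form: matched
    "close(db) = _close(db)",        # short form: intentionally not matched
    "# function commented_out(x)",   # comment: skipped by the caller anyway
]
names = [m.group(1) for line in lines if (m := re.match(function_regex, line.strip()))]
print(names)  # → ['duckdb_open']
```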
def main():
    """Main function to generate the Julia API."""

    print("Creating Julia API")

    parser = configure_parser()
    args = parser.parse_args()
    print("Arguments:")
    for k, v in vars(args).items():
        print(f"  {k}: {v}")

    julia_path = pathlib.Path(args.output)
    enable_auto_1base_index = args.auto_1_index
    enable_original_order = args.use_original_order

    capi_definitions_dir = pathlib.Path(args.capi_dir)
    ext_api_definition_pattern = str(capi_definitions_dir) + "/apis/v1/*/*.json"
    capi_function_definition_pattern = str(capi_definitions_dir) + "/functions/**/*.json"
    ext_api_definitions = parse_ext_api_definitions(ext_api_definition_pattern)
    ext_api_version = get_extension_api_version(ext_api_definitions)
    function_groups, function_map = parse_capi_function_definitions(capi_function_definition_pattern)

    overwrite_function_signatures = {
        # Must be Ptr{Cvoid} and not Ref
        "duckdb_free": (
            "Cvoid",
            ("Ptr{Cvoid}",),
        ),
        "duckdb_bind_blob": (
            "duckdb_state",
            ("duckdb_prepared_statement", "idx_t", "Ptr{Cvoid}", "idx_t"),
        ),
        "duckdb_vector_assign_string_element_len": (
            "Cvoid",
            (
                "duckdb_vector",
                "idx_t",
                "Ptr{UInt8}",
                "idx_t",
            ),  # Must be Ptr{UInt8} instead of Cstring to allow '\0' in the middle
        ),
    }

    with JuliaApiTarget(
        julia_path,
        indent=0,
        auto_1base_index=enable_auto_1base_index,  # WARNING: every arg named "col/row/index" or similar will be 1-based indexed, so the argument is subtracted by 1
        auto_1base_index_return_functions={"duckdb_init_get_column_index"},
        auto_1base_index_ignore_functions={
            "duckdb_parameter_name",  # Parameter names start at 1
            "duckdb_param_type",  # Parameter types (like names) start at 1
            "duckdb_param_logical_type",  # ...
            "duckdb_bind_get_parameter",  # Would be a breaking API change
        },
        skipped_functions=set(),
        type_map=JULIA_BASE_TYPE_MAP,
        overwrite_function_signatures=overwrite_function_signatures,
    ) as printer:
        if enable_original_order:
            print("INFO: Using the original order of the functions from the old API file.")
            printer.manual_order = JULIA_API_ORIGINAL_ORDER

        printer.write_functions(ext_api_version, function_groups, function_map)

        if args.print_type_mapping:
            print("Type maps: (Julia Type -> C Type)")
            K = list(printer.inverse_type_maps.keys())
            K.sort()
            for k in K:
                if k.startswith("Ptr") or k.startswith("Ref"):
                    continue
                v = ", ".join(printer.inverse_type_maps[k])
                print(f"  {k} -> {v}")

    print("Julia API generated successfully!")
    print("Please review the mapped types and check the generated file:")
    print("Hint: also run './format.sh' to format the file and reduce the diff.")
    print(f"Output: {julia_path}")

def configure_parser():
    parser = argparse.ArgumentParser(description="Generate the DuckDB Julia API")
    parser.add_argument(
        "--auto-1-index",
        action="store_true",
        default=True,
        help="Automatically convert 0-based indices to 1-based indices",
    )
    parser.add_argument(
        "--use-original-order",
        action="store_true",
        default=False,
        help="Use the original order of the functions from the old API file. New functions will be appended at the end.",
    )

    parser.add_argument(
        "--print-type-mapping",
        action="store_true",
        default=False,
        help="Print the type mapping from C to Julia",
    )

    parser.add_argument(
        "--capi-dir",
        type=str,
        required=True,
        help="Path to the input C API definitions. Should be a directory containing JSON files.",
    )
    parser.add_argument(
        "output",
        type=str,
        # default="src/api.jl",
        help="Path to the output file",
    )
    return parser


if __name__ == "__main__":
    main()
41 external/duckdb/tools/juliapkg/src/DuckDB.jl vendored Normal file
@@ -0,0 +1,41 @@
module DuckDB

using DBInterface
using WeakRefStrings
using Tables
using Base.Libc
using Dates
using UUIDs
using FixedPointDecimals

export DBInterface, DuckDBException

abstract type ResultType end
struct MaterializedResult <: ResultType end
struct StreamResult <: ResultType end

include("helper.jl")
include("exceptions.jl")
include("ctypes.jl")
include("api.jl")
include("api_helper.jl")
include("logical_type.jl")
include("value.jl")
include("validity_mask.jl")
include("vector.jl")
include("data_chunk.jl")
include("config.jl")
include("database.jl")
include("statement.jl")
include("result.jl")
include("transaction.jl")
include("ddl.jl")
include("appender.jl")
include("table_function.jl")
include("scalar_function.jl")
include("replacement_scan.jl")
include("table_scan.jl")
include("old_interface.jl")

end # module
8254 external/duckdb/tools/juliapkg/src/api.jl vendored Normal file
File diff suppressed because it is too large Load Diff
30 external/duckdb/tools/juliapkg/src/api_helper.jl vendored Normal file
@@ -0,0 +1,30 @@
"""
duckdb_free(s::Cstring)

Free a Cstring allocated by DuckDB. This function is a wrapper around `duckdb_free`.
"""
function duckdb_free(s::Cstring)
    p = pointer(s)
    return duckdb_free(p)
end

"""
Retrieves the member vector of a union vector.

The resulting vector is valid as long as the parent vector is valid.

* vector: The vector
* index: The member index
* returns: The member vector
"""
function duckdb_union_vector_get_member(vector, index)
    return ccall(
        (:duckdb_struct_vector_get_child, libduckdb),
        duckdb_vector,
        (duckdb_vector, UInt64),
        vector,
        1 + (index - 1)
    )
end
131 external/duckdb/tools/juliapkg/src/appender.jl vendored Normal file
@@ -0,0 +1,131 @@
using Dates

"""
	Appender(db_connection, table, [schema])

An appender object that can be used to append rows to an existing table.

* DateTime objects in Julia are stored in milliseconds since the Unix epoch but are converted to microseconds when stored in duckdb.
* Time objects in Julia are stored in nanoseconds since midnight but are converted to microseconds when stored in duckdb.
* Missing and Nothing are stored as NULL in duckdb, but will be converted to Missing when the data is queried back.

# Example
```julia
using DuckDB, DataFrames, Dates
db = DuckDB.DB()

# create a table
DBInterface.execute(db, "CREATE OR REPLACE TABLE data(id INT PRIMARY KEY, value FLOAT, timestamp TIMESTAMP, date DATE)")

# data to insert
len = 100
df = DataFrames.DataFrame(id=collect(1:len),
    value=rand(len),
    timestamp=Dates.now() + Dates.Second.(1:len),
    date=Dates.today() + Dates.Day.(1:len))

# append data by row
appender = DuckDB.Appender(db, "data")
for i in eachrow(df)
    for j in i
        DuckDB.append(appender, j)
    end
    DuckDB.end_row(appender)
end
# flush the appender after all rows
DuckDB.flush(appender)
DuckDB.close(appender)
```
"""
mutable struct Appender
    handle::duckdb_appender

    function Appender(con::Connection, table::AbstractString, schema::Union{AbstractString, Nothing} = nothing)
        handle = Ref{duckdb_appender}()
        if duckdb_appender_create(con.handle, something(schema, C_NULL), table, handle) != DuckDBSuccess
            error_ptr = duckdb_appender_error(handle)
            if error_ptr == C_NULL
                error_message = string("Opening of Appender for table \"", table, "\" failed: unknown error")
            else
                error_message = string(error_ptr)
            end
            duckdb_appender_destroy(handle)
            throw(QueryException(error_message))
        end
        con = new(handle[])
        finalizer(_close_appender, con)
        return con
    end
    function Appender(db::DB, table::AbstractString, schema::Union{AbstractString, Nothing} = nothing)
        return Appender(db.main_connection, table, schema)
    end
end

function _close_appender(appender::Appender)
    if appender.handle != C_NULL
        duckdb_appender_destroy(appender.handle)
    end
    appender.handle = C_NULL
    return
end

function close(appender::Appender)
    _close_appender(appender)
    return
end

append(appender::Appender, val::AbstractFloat) = duckdb_append_double(appender.handle, Float64(val));
append(appender::Appender, val::Bool) = duckdb_append_bool(appender.handle, val);
append(appender::Appender, val::Int8) = duckdb_append_int8(appender.handle, val);
append(appender::Appender, val::Int16) = duckdb_append_int16(appender.handle, val);
append(appender::Appender, val::Int32) = duckdb_append_int32(appender.handle, val);
append(appender::Appender, val::Int64) = duckdb_append_int64(appender.handle, val);
append(appender::Appender, val::Int128) = duckdb_append_hugeint(appender.handle, val);
append(appender::Appender, val::UInt128) = duckdb_append_uhugeint(appender.handle, val);
append(appender::Appender, val::UInt8) = duckdb_append_uint8(appender.handle, val);
append(appender::Appender, val::UInt16) = duckdb_append_uint16(appender.handle, val);
append(appender::Appender, val::UInt32) = duckdb_append_uint32(appender.handle, val);
append(appender::Appender, val::UInt64) = duckdb_append_uint64(appender.handle, val);
append(appender::Appender, val::Float32) = duckdb_append_float(appender.handle, val);
append(appender::Appender, val::Float64) = duckdb_append_double(appender.handle, val);
append(appender::Appender, ::Union{Missing, Nothing}) = duckdb_append_null(appender.handle);
append(appender::Appender, val::AbstractString) = duckdb_append_varchar(appender.handle, val);
append(appender::Appender, val::Base.UUID) = append(appender, string(val));
append(appender::Appender, val::Vector{UInt8}) = duckdb_append_blob(appender.handle, val, sizeof(val));
append(appender::Appender, val::FixedDecimal) = append(appender, string(val));
# append(appender::Appender, val::WeakRefString{UInt8}) = duckdb_append_varchar(stmt.handle, i, val.ptr, val.len);
append(appender::Appender, val::Date) =
    duckdb_append_date(appender.handle, Dates.date2epochdays(val) - ROUNDING_EPOCH_TO_UNIX_EPOCH_DAYS);
# nanoseconds to microseconds
append(appender::Appender, val::Time) = duckdb_append_time(appender.handle, Dates.value(val) ÷ 1000);

# milliseconds to microseconds
append(appender::Appender, val::DateTime) =
    duckdb_append_timestamp(appender.handle, (Dates.datetime2epochms(val) - ROUNDING_EPOCH_TO_UNIX_EPOCH_MS) * 1000);

function append(appender::Appender, val::AbstractVector{T}) where {T}
    value = create_value(val)
    if length(val) == 0
        duckdb_append_null(appender.handle)
    else
        duckdb_append_value(appender.handle, value.handle)
    end
    return
end

function append(appender::Appender, val::Any)
    println(val)
    throw(NotImplementedException("unsupported type for append"))
end

function end_row(appender::Appender)
    duckdb_appender_end_row(appender.handle)
    return
end

function flush(appender::Appender)
    duckdb_appender_flush(appender.handle)
    return
end

DBInterface.close!(appender::Appender) = _close_appender(appender)
48 external/duckdb/tools/juliapkg/src/config.jl vendored Normal file
@@ -0,0 +1,48 @@
"""
Configuration object
"""
mutable struct Config
    handle::duckdb_config

    function Config(args...; kwargs...)
        handle = Ref{duckdb_config}()
        duckdb_create_config(handle)

        result = new(handle[])
        finalizer(_destroy_config, result)

        _fill_config!(result, args...; kwargs...)

        return result
    end
end

function _destroy_config(config::Config)
    if config.handle != C_NULL
        duckdb_destroy_config(config.handle)
    end
    config.handle = C_NULL
    return
end

DBInterface.close!(config::Config) = _destroy_config(config)

function Base.setindex!(config::Config, option::AbstractString, name::AbstractString)
    if duckdb_set_config(config.handle, name, option) != DuckDBSuccess
        throw(QueryException(string("Unrecognized configuration option \"", name, "\"")))
    end
end

@deprecate set_config(config::Config, name::AbstractString, option::AbstractString) setindex!(config, option, name)

_fill_config!(config, options::AbstractVector) =
    for (name, option) in options
        config[name] = option
    end

_fill_config!(config, options::Union{NamedTuple, AbstractDict}) =
    for (name, option) in pairs(options)
        config[string(name)] = option
    end

_fill_config!(config; kwargs...) = _fill_config!(config, NamedTuple(kwargs))
582 external/duckdb/tools/juliapkg/src/ctypes.jl vendored Normal file
@@ -0,0 +1,582 @@
const STRING_INLINE_LENGTH = 12 # length of the inline string in duckdb_string_t
const idx_t = UInt64 # DuckDB index type

const duckdb_aggregate_combine = Ptr{Cvoid}
const duckdb_aggregate_destroy = Ptr{Cvoid}
const duckdb_aggregate_finalize = Ptr{Cvoid}
const duckdb_aggregate_function = Ptr{Cvoid}
const duckdb_aggregate_function_set = Ptr{Cvoid}
const duckdb_aggregate_init = Ptr{Cvoid}
const duckdb_aggregate_state_size = Ptr{Cvoid}
const duckdb_aggregate_update = Ptr{Cvoid}
const duckdb_appender = Ptr{Cvoid}
const duckdb_arrow = Ptr{Cvoid}
const duckdb_arrow_array = Ptr{Cvoid}
const duckdb_arrow_schema = Ptr{Cvoid}
const duckdb_arrow_stream = Ptr{Cvoid}
const duckdb_bind_info = Ptr{Cvoid}
const duckdb_cast_function = Ptr{Cvoid}
const duckdb_cast_function_ptr = Ptr{Cvoid}
const duckdb_client_context = Ptr{Cvoid}
const duckdb_config = Ptr{Cvoid}
const duckdb_connection = Ptr{Cvoid}
const duckdb_create_type_info = Ptr{Cvoid}
const duckdb_data_chunk = Ptr{Cvoid}
const duckdb_database = Ptr{Cvoid}
const duckdb_delete_callback = Ptr{Cvoid}
const duckdb_extracted_statements = Ptr{Cvoid}
const duckdb_function_info = Ptr{Cvoid}
const duckdb_init_info = Ptr{Cvoid}
const duckdb_instance_cache = Ptr{Cvoid}
const duckdb_logical_type = Ptr{Cvoid}
const duckdb_pending_result = Ptr{Cvoid}
const duckdb_prepared_statement = Ptr{Cvoid}
const duckdb_profiling_info = Ptr{Cvoid}
const duckdb_replacement_callback = Ptr{Cvoid}
const duckdb_replacement_scan_info = Ptr{Cvoid}
const duckdb_scalar_function = Ptr{Cvoid}
const duckdb_scalar_function_bind = Ptr{Cvoid}
const duckdb_scalar_function_set = Ptr{Cvoid}
const duckdb_selection_vector = Ptr{Cvoid}
const duckdb_table_description = Ptr{Cvoid}
const duckdb_table_function = Ptr{Cvoid}
const duckdb_table_function_ptr = Ptr{Cvoid}
const duckdb_table_function_bind = Ptr{Cvoid}
const duckdb_table_function_init = Ptr{Cvoid}
const duckdb_task_state = Ptr{Cvoid}
const duckdb_value = Ptr{Cvoid}
const duckdb_vector = Ptr{Cvoid}


const duckdb_state = Cint;
const DuckDBSuccess = 0;
const DuckDBError = 1;

const duckdb_pending_state = Cint;
const DUCKDB_PENDING_RESULT_READY = 0;
const DUCKDB_PENDING_RESULT_NOT_READY = 1;
const DUCKDB_PENDING_ERROR = 2;
const DUCKDB_PENDING_NO_TASKS_AVAILABLE = 3;

@enum DUCKDB_RESULT_TYPE_::Cint begin
    DUCKDB_RESULT_TYPE_INVALID = 0
    DUCKDB_RESULT_TYPE_CHANGED_ROWS = 1
    DUCKDB_RESULT_TYPE_NOTHING = 2
    DUCKDB_RESULT_TYPE_QUERY_RESULT = 3
end
const duckdb_result_type = DUCKDB_RESULT_TYPE_;

@enum DUCKDB_STATEMENT_TYPE_::Cint begin
    DUCKDB_STATEMENT_TYPE_INVALID = 0
    DUCKDB_STATEMENT_TYPE_SELECT = 1
    DUCKDB_STATEMENT_TYPE_INSERT = 2
    DUCKDB_STATEMENT_TYPE_UPDATE = 3
    DUCKDB_STATEMENT_TYPE_EXPLAIN = 4
    DUCKDB_STATEMENT_TYPE_DELETE = 5
    DUCKDB_STATEMENT_TYPE_PREPARE = 6
    DUCKDB_STATEMENT_TYPE_CREATE = 7
    DUCKDB_STATEMENT_TYPE_EXECUTE = 8
    DUCKDB_STATEMENT_TYPE_ALTER = 9
    DUCKDB_STATEMENT_TYPE_TRANSACTION = 10
    DUCKDB_STATEMENT_TYPE_COPY = 11
    DUCKDB_STATEMENT_TYPE_ANALYZE = 12
    DUCKDB_STATEMENT_TYPE_VARIABLE_SET = 13
    DUCKDB_STATEMENT_TYPE_CREATE_FUNC = 14
    DUCKDB_STATEMENT_TYPE_DROP = 15
    DUCKDB_STATEMENT_TYPE_EXPORT = 16
    DUCKDB_STATEMENT_TYPE_PRAGMA = 17
    DUCKDB_STATEMENT_TYPE_VACUUM = 18
    DUCKDB_STATEMENT_TYPE_CALL = 19
    DUCKDB_STATEMENT_TYPE_SET = 20
    DUCKDB_STATEMENT_TYPE_LOAD = 21
    DUCKDB_STATEMENT_TYPE_RELATION = 22
    DUCKDB_STATEMENT_TYPE_EXTENSION = 23
    DUCKDB_STATEMENT_TYPE_LOGICAL_PLAN = 24
    DUCKDB_STATEMENT_TYPE_ATTACH = 25
    DUCKDB_STATEMENT_TYPE_DETACH = 26
    DUCKDB_STATEMENT_TYPE_MULTI = 27
end
const duckdb_statement_type = DUCKDB_STATEMENT_TYPE_

@enum DUCKDB_ERROR_TYPE_::Cint begin
    DUCKDB_ERROR_INVALID = 0
    DUCKDB_ERROR_OUT_OF_RANGE = 1
|
||||
DUCKDB_ERROR_CONVERSION = 2
|
||||
DUCKDB_ERROR_UNKNOWN_TYPE = 3
|
||||
DUCKDB_ERROR_DECIMAL = 4
|
||||
DUCKDB_ERROR_MISMATCH_TYPE = 5
|
||||
DUCKDB_ERROR_DIVIDE_BY_ZERO = 6
|
||||
DUCKDB_ERROR_OBJECT_SIZE = 7
|
||||
DUCKDB_ERROR_INVALID_TYPE = 8
|
||||
DUCKDB_ERROR_SERIALIZATION = 9
|
||||
DUCKDB_ERROR_TRANSACTION = 10
|
||||
DUCKDB_ERROR_NOT_IMPLEMENTED = 11
|
||||
DUCKDB_ERROR_EXPRESSION = 12
|
||||
DUCKDB_ERROR_CATALOG = 13
|
||||
DUCKDB_ERROR_PARSER = 14
|
||||
DUCKDB_ERROR_PLANNER = 15
|
||||
DUCKDB_ERROR_SCHEDULER = 16
|
||||
DUCKDB_ERROR_EXECUTOR = 17
|
||||
DUCKDB_ERROR_CONSTRAINT = 18
|
||||
DUCKDB_ERROR_INDEX = 19
|
||||
DUCKDB_ERROR_STAT = 20
|
||||
DUCKDB_ERROR_CONNECTION = 21
|
||||
DUCKDB_ERROR_SYNTAX = 22
|
||||
DUCKDB_ERROR_SETTINGS = 23
|
||||
DUCKDB_ERROR_BINDER = 24
|
||||
DUCKDB_ERROR_NETWORK = 25
|
||||
DUCKDB_ERROR_OPTIMIZER = 26
|
||||
DUCKDB_ERROR_NULL_POINTER = 27
|
||||
DUCKDB_ERROR_IO = 28
|
||||
DUCKDB_ERROR_INTERRUPT = 29
|
||||
DUCKDB_ERROR_FATAL = 30
|
||||
DUCKDB_ERROR_INTERNAL = 31
|
||||
DUCKDB_ERROR_INVALID_INPUT = 32
|
||||
DUCKDB_ERROR_OUT_OF_MEMORY = 33
|
||||
DUCKDB_ERROR_PERMISSION = 34
|
||||
DUCKDB_ERROR_PARAMETER_NOT_RESOLVED = 35
|
||||
DUCKDB_ERROR_PARAMETER_NOT_ALLOWED = 36
|
||||
DUCKDB_ERROR_DEPENDENCY = 37
|
||||
DUCKDB_ERROR_HTTP = 38
|
||||
DUCKDB_ERROR_MISSING_EXTENSION = 39
|
||||
DUCKDB_ERROR_AUTOLOAD = 40
|
||||
DUCKDB_ERROR_SEQUENCE = 41
|
||||
DUCKDB_INVALID_CONFIGURATION = 42
|
||||
end
|
||||
const duckdb_error_type = DUCKDB_ERROR_TYPE_
|
||||
|
||||
@enum DUCKDB_CAST_MODE_::Cint begin
|
||||
DUCKDB_CAST_NORMAL = 0
|
||||
DUCKDB_CAST_TRY = 1
|
||||
end
|
||||
const duckdb_cast_mode = DUCKDB_CAST_MODE_
|
||||
|
||||
@enum DUCKDB_TYPE_::Cint begin
|
||||
DUCKDB_TYPE_INVALID = 0
|
||||
DUCKDB_TYPE_BOOLEAN = 1
|
||||
DUCKDB_TYPE_TINYINT = 2
|
||||
DUCKDB_TYPE_SMALLINT = 3
|
||||
DUCKDB_TYPE_INTEGER = 4
|
||||
DUCKDB_TYPE_BIGINT = 5
|
||||
DUCKDB_TYPE_UTINYINT = 6
|
||||
DUCKDB_TYPE_USMALLINT = 7
|
||||
DUCKDB_TYPE_UINTEGER = 8
|
||||
DUCKDB_TYPE_UBIGINT = 9
|
||||
DUCKDB_TYPE_FLOAT = 10
|
||||
DUCKDB_TYPE_DOUBLE = 11
|
||||
DUCKDB_TYPE_TIMESTAMP = 12
|
||||
DUCKDB_TYPE_DATE = 13
|
||||
DUCKDB_TYPE_TIME = 14
|
||||
DUCKDB_TYPE_INTERVAL = 15
|
||||
DUCKDB_TYPE_HUGEINT = 16
|
||||
DUCKDB_TYPE_UHUGEINT = 32
|
||||
DUCKDB_TYPE_VARCHAR = 17
|
||||
DUCKDB_TYPE_BLOB = 18
|
||||
DUCKDB_TYPE_DECIMAL = 19
|
||||
DUCKDB_TYPE_TIMESTAMP_S = 20
|
||||
DUCKDB_TYPE_TIMESTAMP_MS = 21
|
||||
DUCKDB_TYPE_TIMESTAMP_NS = 22
|
||||
DUCKDB_TYPE_ENUM = 23
|
||||
DUCKDB_TYPE_LIST = 24
|
||||
DUCKDB_TYPE_STRUCT = 25
|
||||
DUCKDB_TYPE_MAP = 26
|
||||
DUCKDB_TYPE_UUID = 27
|
||||
DUCKDB_TYPE_UNION = 28
|
||||
DUCKDB_TYPE_BIT = 29
|
||||
DUCKDB_TYPE_TIME_TZ = 30
|
||||
DUCKDB_TYPE_TIMESTAMP_TZ = 31
|
||||
DUCKDB_TYPE_ARRAY = 33
|
||||
DUCKDB_TYPE_ANY = 34
|
||||
DUCKDB_TYPE_BIGNUM = 35
|
||||
DUCKDB_TYPE_SQLNULL = 36
|
||||
DUCKDB_TYPE_STRING_LITERAL = 37
|
||||
DUCKDB_TYPE_INTEGER_LITERAL = 38
|
||||
end
|
||||
const DUCKDB_TYPE = DUCKDB_TYPE_
|
||||
|
||||
|
||||
"""
|
||||
Days are stored as days since 1970-01-01\n
|
||||
Use the duckdb_from_date/duckdb_to_date function to extract individual information
|
||||
|
||||
"""
|
||||
struct duckdb_date
|
||||
days::Int32
|
||||
end
|
||||
|
||||
|
||||
|
||||
struct duckdb_date_struct
|
||||
year::Int32
|
||||
month::Int8
|
||||
day::Int8
|
||||
end
|
||||
|
||||
"""
|
||||
Time is stored as microseconds since 00:00:00\n
|
||||
Use the duckdb_from_time/duckdb_to_time function to extract individual information
|
||||
|
||||
"""
|
||||
struct duckdb_time
|
||||
micros::Int64
|
||||
end
|
||||
|
||||
struct duckdb_time_struct
|
||||
hour::Int8
|
||||
min::Int8
|
||||
sec::Int8
|
||||
micros::Int32
|
||||
end
|
||||
|
||||
|
||||
struct duckdb_time_tz
|
||||
bits::UInt64
|
||||
end
|
||||
|
||||
struct duckdb_time_tz_struct
|
||||
time::duckdb_time_struct
|
||||
offset::Int32
|
||||
end
|
||||
|
||||
"""
|
||||
Timestamps are stored as microseconds since 1970-01-01\n
|
||||
Use the duckdb_from_timestamp/duckdb_to_timestamp function to extract individual information
|
||||
|
||||
"""
|
||||
struct duckdb_timestamp
|
||||
micros::Int64
|
||||
end
|
||||
|
||||
struct duckdb_timestamp_s
|
||||
seconds::Int64
|
||||
end
|
||||
|
||||
struct duckdb_timestamp_ms
|
||||
millis::Int64
|
||||
end
|
||||
|
||||
struct duckdb_timestamp_ns
|
||||
nanos::Int64
|
||||
end
|
||||
|
||||
|
||||
struct duckdb_timestamp_struct
|
||||
date::duckdb_date_struct
|
||||
time::duckdb_time_struct
|
||||
end
|
||||
|
||||
struct duckdb_interval
|
||||
months::Int32
|
||||
days::Int32
|
||||
micros::Int64
|
||||
end
|
||||
|
||||
"""
|
||||
Hugeints are composed in a (lower, upper) component\n
|
||||
The value of the hugeint is upper * 2^64 + lower\n
|
||||
For easy usage, the functions duckdb_hugeint_to_double/duckdb_double_to_hugeint are recommended
|
||||
|
||||
"""
|
||||
struct duckdb_hugeint
|
||||
lower::UInt64
|
||||
upper::Int64
|
||||
end
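The docstring above defines a hugeint's value as `upper * 2^64 + lower`, which the `Base.convert` methods later in this diff implement with a shift and an add. A small arithmetic sketch of that round trip (Python, outside this Julia package, for illustration only):

```python
# Decompose a signed 128-bit value into DuckDB's (lower, upper) hugeint
# layout, then recompose it: value == upper * 2**64 + lower.
def to_hugeint(value):
    lower = value & 0xFFFF_FFFF_FFFF_FFFF  # unsigned low 64 bits
    upper = value >> 64                    # signed high 64 bits (arithmetic shift)
    return lower, upper

def from_hugeint(lower, upper):
    return upper * 2**64 + lower

for v in (0, 1, -1, 2**100, -(2**100)):
    assert from_hugeint(*to_hugeint(v)) == v
```

Note that for negative values the lower component stays unsigned while the sign lives entirely in `upper`, matching the `lower::UInt64` / `upper::Int64` field types above.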
struct duckdb_uhugeint
    lower::UInt64
    upper::UInt64
end

"""
Decimals are composed of a width and a scale, and are stored in a hugeint
"""
struct duckdb_decimal
    width::UInt8
    scale::UInt8
    value::duckdb_hugeint
end
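As the docstring says, a decimal is an integer payload interpreted with a fixed width and scale: the logical number is the stored value divided by 10^scale. A hedged sketch of that interpretation (Python, illustrative only):

```python
# Interpret a duckdb_decimal-style (value, scale) pair:
# the stored integer divided by 10**scale gives the logical number.
from fractions import Fraction

def decimal_value(value, scale):
    return Fraction(value, 10 ** scale)

# stored value 12345 with scale 2 represents 123.45
assert decimal_value(12345, 2) == Fraction(12345, 100)
# scale 0 is just the integer itself
assert decimal_value(42, 0) == 42
```

The width bounds how many decimal digits the value may use, which is why the `create_logical_type` method for `FixedDecimal` later in this diff picks widths 4/9/18/38 for Int16/Int32/Int64/Int128 payloads.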
struct duckdb_string_t
    length::UInt32
    data::NTuple{STRING_INLINE_LENGTH, UInt8}
end

struct duckdb_string_t_ptr
    length::UInt32
    prefix::NTuple{4, UInt8} # 4 bytes prefix
    data::Cstring
end

struct duckdb_list_entry_t
    offset::UInt64
    length::UInt64
end

struct duckdb_query_progress_type
    percentage::Float64
    rows_processed::UInt64
    total_rows_to_process::UInt64
end

struct duckdb_bignum
    data::Ptr{UInt8}
    size::idx_t
    is_negative::Bool
end

struct duckdb_column
    __deprecated_data::Ptr{Cvoid}
    __deprecated_nullmask::Ptr{UInt8}
    __deprecated_type::Ptr{DUCKDB_TYPE}
    __deprecated_name::Ptr{UInt8}
    internal_data::Ptr{Cvoid}
end

struct duckdb_result
    __deprecated_column_count::Ptr{UInt64}
    __deprecated_row_count::Ptr{UInt64}
    __deprecated_rows_changed::Ptr{UInt64}
    __deprecated_columns::Ptr{duckdb_column}
    __deprecated_error_message::Ptr{UInt8}
    internal_data::Ptr{Cvoid}
end

INTERNAL_TYPE_MAP = Dict(
    DUCKDB_TYPE_BOOLEAN => Bool,
    DUCKDB_TYPE_TINYINT => Int8,
    DUCKDB_TYPE_SMALLINT => Int16,
    DUCKDB_TYPE_INTEGER => Int32,
    DUCKDB_TYPE_BIGINT => Int64,
    DUCKDB_TYPE_UTINYINT => UInt8,
    DUCKDB_TYPE_USMALLINT => UInt16,
    DUCKDB_TYPE_UINTEGER => UInt32,
    DUCKDB_TYPE_UBIGINT => UInt64,
    DUCKDB_TYPE_FLOAT => Float32,
    DUCKDB_TYPE_DOUBLE => Float64,
    DUCKDB_TYPE_TIMESTAMP => duckdb_timestamp,
    DUCKDB_TYPE_TIMESTAMP_S => duckdb_timestamp_s,
    DUCKDB_TYPE_TIMESTAMP_MS => duckdb_timestamp_ms,
    DUCKDB_TYPE_TIMESTAMP_NS => duckdb_timestamp_ns,
    DUCKDB_TYPE_TIMESTAMP_TZ => duckdb_timestamp,
    DUCKDB_TYPE_DATE => duckdb_date,
    DUCKDB_TYPE_TIME => duckdb_time,
    DUCKDB_TYPE_TIME_TZ => duckdb_time_tz,
    DUCKDB_TYPE_INTERVAL => duckdb_interval,
    DUCKDB_TYPE_HUGEINT => duckdb_hugeint,
    DUCKDB_TYPE_UHUGEINT => duckdb_uhugeint,
    DUCKDB_TYPE_UUID => duckdb_hugeint,
    DUCKDB_TYPE_VARCHAR => duckdb_string_t,
    DUCKDB_TYPE_BLOB => duckdb_string_t,
    DUCKDB_TYPE_BIT => duckdb_string_t,
    DUCKDB_TYPE_LIST => duckdb_list_entry_t,
    DUCKDB_TYPE_STRUCT => Cvoid,
    DUCKDB_TYPE_MAP => duckdb_list_entry_t,
    DUCKDB_TYPE_UNION => Cvoid
)

JULIA_TYPE_MAP = Dict(
    DUCKDB_TYPE_INVALID => Missing,
    DUCKDB_TYPE_BOOLEAN => Bool,
    DUCKDB_TYPE_TINYINT => Int8,
    DUCKDB_TYPE_SMALLINT => Int16,
    DUCKDB_TYPE_INTEGER => Int32,
    DUCKDB_TYPE_BIGINT => Int64,
    DUCKDB_TYPE_HUGEINT => Int128,
    DUCKDB_TYPE_UHUGEINT => UInt128,
    DUCKDB_TYPE_UTINYINT => UInt8,
    DUCKDB_TYPE_USMALLINT => UInt16,
    DUCKDB_TYPE_UINTEGER => UInt32,
    DUCKDB_TYPE_UBIGINT => UInt64,
    DUCKDB_TYPE_FLOAT => Float32,
    DUCKDB_TYPE_DOUBLE => Float64,
    DUCKDB_TYPE_DATE => Date,
    DUCKDB_TYPE_TIME => Time,
    DUCKDB_TYPE_TIME_TZ => Time,
    DUCKDB_TYPE_TIMESTAMP => DateTime,
    DUCKDB_TYPE_TIMESTAMP_TZ => DateTime,
    DUCKDB_TYPE_TIMESTAMP_S => DateTime,
    DUCKDB_TYPE_TIMESTAMP_MS => DateTime,
    DUCKDB_TYPE_TIMESTAMP_NS => DateTime,
    DUCKDB_TYPE_INTERVAL => Dates.CompoundPeriod,
    DUCKDB_TYPE_UUID => UUID,
    DUCKDB_TYPE_VARCHAR => String,
    DUCKDB_TYPE_ENUM => String,
    DUCKDB_TYPE_BLOB => Base.CodeUnits{UInt8, String},
    DUCKDB_TYPE_BIT => Base.CodeUnits{UInt8, String},
    DUCKDB_TYPE_MAP => Dict
)

# convert a DuckDB type into its Julia equivalent
function duckdb_type_to_internal_type(x::DUCKDB_TYPE)
    if !haskey(INTERNAL_TYPE_MAP, x)
        throw(NotImplementedException(string("Unsupported type for duckdb_type_to_internal_type: ", x)))
    end
    return INTERNAL_TYPE_MAP[x]
end

function duckdb_type_to_julia_type(x)
    type_id = get_type_id(x)
    if type_id == DUCKDB_TYPE_DECIMAL
        internal_type_id = get_internal_type_id(x)
        scale = get_decimal_scale(x)
        if internal_type_id == DUCKDB_TYPE_SMALLINT
            return FixedDecimal{Int16, scale}
        elseif internal_type_id == DUCKDB_TYPE_INTEGER
            return FixedDecimal{Int32, scale}
        elseif internal_type_id == DUCKDB_TYPE_BIGINT
            return FixedDecimal{Int64, scale}
        elseif internal_type_id == DUCKDB_TYPE_HUGEINT
            return FixedDecimal{Int128, scale}
        else
            throw(NotImplementedException("Unimplemented internal type for decimal"))
        end
    elseif type_id == DUCKDB_TYPE_LIST
        return Vector{Union{Missing, duckdb_type_to_julia_type(get_list_child_type(x))}}
    elseif type_id == DUCKDB_TYPE_STRUCT
        child_count = get_struct_child_count(x)
        struct_names::Vector{Symbol} = Vector()
        for i in 1:child_count
            child_name::Symbol = Symbol(get_struct_child_name(x, i))
            push!(struct_names, child_name)
        end
        struct_names_tuple = Tuple(x for x in struct_names)
        return Union{Missing, NamedTuple{struct_names_tuple}}
    elseif type_id == DUCKDB_TYPE_UNION
        member_count = get_union_member_count(x)
        member_types::Vector{DataType} = Vector()
        for i in 1:member_count
            member_type::DataType = duckdb_type_to_julia_type(get_union_member_type(x, i))
            push!(member_types, member_type)
        end
        return Union{Missing, member_types...}
    end
    if !haskey(JULIA_TYPE_MAP, type_id)
        throw(NotImplementedException(string("Unsupported type for duckdb_type_to_julia_type: ", type_id)))
    end
    return JULIA_TYPE_MAP[type_id]
end

const ROUNDING_EPOCH_TO_UNIX_EPOCH_DAYS = 719528
const ROUNDING_EPOCH_TO_UNIX_EPOCH_MS = 62167219200000

sym(ptr) = ccall(:jl_symbol, Ref{Symbol}, (Ptr{UInt8},), ptr)
sym(ptr::Cstring) = ccall(:jl_symbol, Ref{Symbol}, (Cstring,), ptr)

# %% --- Older Types ------------------------------------------ #

struct duckdb_string
    data::Ptr{UInt8}
    length::idx_t

    function duckdb_string(data, length)
        Base.depwarn("duckdb_string is deprecated, use duckdb_string_t instead", :deprecated)
        return new(data, length)
    end
end

"""
BLOBs are composed of a byte pointer and a size. You must free blob.data
with `duckdb_free`.
"""
struct duckdb_blob
    data::Ref{UInt8}
    length::idx_t
end

"""
BITs are composed of a byte pointer and a size.
BIT byte data has 0 to 7 bits of padding.
The first byte contains the number of padding bits.
This number of bits of the second byte are set to 1, starting from the MSB.
You must free `data` with `duckdb_free`.
"""
struct duckdb_bit
    data::Ref{UInt8}
    size::idx_t
end
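Given the padding rule in the docstring above (the first byte stores how many of the following bits are padding), the padding count for an n-bit value follows directly from n. A small arithmetic sketch (Python, illustrative only):

```python
# For an n-bit BIT value stored in ceil(n / 8) data bytes, the number of
# unused (padding) bits recorded in the leading header byte:
def bit_padding(n_bits):
    return (8 - n_bits % 8) % 8

assert bit_padding(8) == 0   # exactly one full byte, no padding
assert bit_padding(10) == 6  # two data bytes, six padded bits
assert bit_padding(1) == 7   # one data byte, seven padded bits
```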
Base.convert(::Type{duckdb_blob}, val::AbstractArray{UInt8}) = duckdb_blob(val, length(val))
Base.convert(::Type{duckdb_blob}, val::AbstractString) = duckdb_blob(codeunits(val))

# %% ----- Conversions ------------------------------

# HUGEINT / INT128
# Fast conversion without typechecking
Base.convert(::Type{Int128}, val::duckdb_hugeint) = Int128(val.lower) + Int128(val.upper) << 64
Base.convert(::Type{UInt128}, val::duckdb_uhugeint) = UInt128(val.lower) + UInt128(val.upper) << 64
Base.cconvert(::Type{duckdb_hugeint}, x::Int128) =
    duckdb_hugeint((x & 0xFFFF_FFFF_FFFF_FFFF) % UInt64, (x >> 64) % Int64)
Base.cconvert(::Type{duckdb_uhugeint}, v::UInt128) = duckdb_uhugeint(v % UInt64, (v >> 64) % UInt64)

# DATE & TIME Raw
Base.convert(::Type{duckdb_date}, val::Integer) = duckdb_date(val)
Base.convert(::Type{duckdb_time}, val::Integer) = duckdb_time(val)
Base.convert(::Type{duckdb_timestamp}, val::Integer) = duckdb_timestamp(val)
Base.convert(::Type{duckdb_timestamp_s}, val::Integer) = duckdb_timestamp_s(val)
Base.convert(::Type{duckdb_timestamp_ms}, val::Integer) = duckdb_timestamp_ms(val)
Base.convert(::Type{duckdb_timestamp_ns}, val::Integer) = duckdb_timestamp_ns(val)
Base.convert(::Type{duckdb_time_tz}, val::Integer) = duckdb_time_tz(val)

Base.convert(::Type{<:Integer}, val::duckdb_date) = val.days
Base.convert(::Type{<:Integer}, val::duckdb_time) = val.micros
Base.convert(::Type{<:Integer}, val::duckdb_timestamp) = val.micros
Base.convert(::Type{<:Integer}, val::duckdb_timestamp_s) = val.seconds
Base.convert(::Type{<:Integer}, val::duckdb_timestamp_ms) = val.millis
Base.convert(::Type{<:Integer}, val::duckdb_timestamp_ns) = val.nanos

function Base.convert(::Type{Date}, val::duckdb_date)
    return Dates.epochdays2date(val.days + ROUNDING_EPOCH_TO_UNIX_EPOCH_DAYS)
end
function Base.convert(::Type{duckdb_date}, val::Date)
    return duckdb_date(Dates.date2epochdays(val) - ROUNDING_EPOCH_TO_UNIX_EPOCH_DAYS)
end

function Base.convert(::Type{Time}, val::duckdb_time)
    return Dates.Time(
        val.micros ÷ 3_600_000_000,
        val.micros ÷ 60_000_000 % 60,
        val.micros ÷ 1_000_000 % 60,
        val.micros ÷ 1_000 % 1_000,
        val.micros % 1_000
    )
end
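The conversion above splits a microseconds-since-midnight count into hour, minute, second, millisecond, and microsecond fields with integer division and modulo. The same decomposition, mirroring the divisors used above (Python, for illustration):

```python
# Decompose microseconds-since-00:00:00 the same way the Julia
# Base.convert(::Type{Time}, val::duckdb_time) method does.
def split_time_micros(micros):
    hour = micros // 3_600_000_000
    minute = micros // 60_000_000 % 60
    second = micros // 1_000_000 % 60
    milli = micros // 1_000 % 1_000
    micro = micros % 1_000
    return hour, minute, second, milli, micro

# 01:02:03.004005 -> ((1*60 + 2)*60 + 3) seconds plus 4005 microseconds
micros = ((1 * 60 + 2) * 60 + 3) * 1_000_000 + 4_005
assert split_time_micros(micros) == (1, 2, 3, 4, 5)
```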
function Base.convert(::Type{Time}, val::duckdb_time_tz)
    time_tz = duckdb_from_time_tz(val)
    # TODO: how to preserve the offset?
    return Dates.Time(
        time_tz.time.hour,
        time_tz.time.min,
        time_tz.time.sec,
        time_tz.time.micros ÷ 1000,
        time_tz.time.micros % 1000
    )
end

Base.convert(::Type{Dates.DateTime}, val::duckdb_timestamp_s) =
    Dates.epochms2datetime((val.seconds * 1000) + ROUNDING_EPOCH_TO_UNIX_EPOCH_MS)
Base.convert(::Type{Dates.DateTime}, val::duckdb_timestamp_ms) =
    Dates.epochms2datetime((val.millis) + ROUNDING_EPOCH_TO_UNIX_EPOCH_MS)
Base.convert(::Type{Dates.DateTime}, val::duckdb_timestamp) =
    Dates.epochms2datetime((val.micros ÷ 1_000) + ROUNDING_EPOCH_TO_UNIX_EPOCH_MS)
Base.convert(::Type{Dates.DateTime}, val::duckdb_timestamp_ns) =
    Dates.epochms2datetime((val.nanos ÷ 1_000_000) + ROUNDING_EPOCH_TO_UNIX_EPOCH_MS)

Base.convert(::Type{Dates.CompoundPeriod}, val::duckdb_interval) =
    Dates.CompoundPeriod(Dates.Month(val.months), Dates.Day(val.days), Dates.Microsecond(val.micros))

function Base.convert(::Type{UUID}, val::duckdb_hugeint)
    hugeint = convert(Int128, val)
    base_value = Int128(170141183460469231731687303715884105727)
    if hugeint < 0
        return UUID(UInt128(hugeint + base_value + 1))
    else
        return UUID(UInt128(hugeint) + base_value + 1)
    end
end
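The UUID conversion above maps DuckDB's signed hugeint onto an unsigned 128-bit UUID by adding `base_value + 1`, i.e. 2^127 (the `base_value` constant is 2^127 - 1). The two branches only exist to avoid overflowing the signed intermediate; the underlying offset arithmetic is a single shift of the range (Python sketch, illustrative only):

```python
# Adding 2**127 maps the signed range [-2**127, 2**127 - 1]
# onto the unsigned range [0, 2**128 - 1].
BASE = 2**127 - 1  # 170141183460469231731687303715884105727, as in the code above

def hugeint_to_uuid_int(h):
    return h + BASE + 1

assert hugeint_to_uuid_int(-2**127) == 0
assert hugeint_to_uuid_int(0) == 2**127
assert hugeint_to_uuid_int(2**127 - 1) == 2**128 - 1
```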
# DECIMALS
Base.convert(::Type{Float64}, val::duckdb_decimal) = duckdb_decimal_to_double(val)
Base.convert(::Type{duckdb_decimal}, val::Float64) = duckdb_double_to_decimal(val)
66
external/duckdb/tools/juliapkg/src/data_chunk.jl
vendored
Normal file
@@ -0,0 +1,66 @@
"""
DuckDB data chunk
"""
mutable struct DataChunk
    handle::duckdb_data_chunk

    function DataChunk(handle::duckdb_data_chunk, destroy::Bool)
        result = new(handle)
        if destroy
            finalizer(_destroy_data_chunk, result)
        end
        return result
    end
end

function get_column_count(chunk::DataChunk)
    return duckdb_data_chunk_get_column_count(chunk.handle)
end

function get_size(chunk::DataChunk)
    return duckdb_data_chunk_get_size(chunk.handle)
end

function set_size(chunk::DataChunk, size::Int64)
    return duckdb_data_chunk_set_size(chunk.handle, size)
end

function get_vector(chunk::DataChunk, col_idx::Int64)::Vec
    if col_idx < 1 || col_idx > get_column_count(chunk)
        throw(
            InvalidInputException(
                string(
                    "get_array column index ",
                    col_idx,
                    " out of range, expected value between 1 and ",
                    get_column_count(chunk)
                )
            )
        )
    end
    return Vec(duckdb_data_chunk_get_vector(chunk.handle, col_idx))
end

function get_array(chunk::DataChunk, col_idx::Int64, ::Type{T})::Vector{T} where {T}
    return get_array(get_vector(chunk, col_idx), T)
end

function get_validity(chunk::DataChunk, col_idx::Int64)::ValidityMask
    return get_validity(get_vector(chunk, col_idx))
end

function all_valid(chunk::DataChunk, col_idx::Int64)
    return all_valid(get_vector(chunk, col_idx), get_size(chunk))
end

# this is only required when we own the data chunk
function _destroy_data_chunk(chunk::DataChunk)
    if chunk.handle != C_NULL
        duckdb_destroy_data_chunk(chunk.handle)
    end
    return chunk.handle = C_NULL
end

function destroy_data_chunk(chunk::DataChunk)
    return _destroy_data_chunk(chunk)
end
122
external/duckdb/tools/juliapkg/src/database.jl
vendored
Normal file
@@ -0,0 +1,122 @@
"""
Internal DuckDB database handle.
"""
mutable struct DuckDBHandle
    file::String
    handle::duckdb_database
    functions::Vector{Any}
    scalar_functions::Dict{String, Any}
    registered_objects::Dict{Any, Any}

    function DuckDBHandle(f::AbstractString, config::Config)
        f = String(isempty(f) ? f : expanduser(f))
        handle = Ref{duckdb_database}()
        error = Ref{Cstring}()
        if duckdb_open_ext(f, handle, config.handle, error) != DuckDBSuccess
            error_message = unsafe_string(error[])
            duckdb_free(pointer(error[]))
            throw(ConnectionException(error_message))
        end

        db = new(f, handle[], Vector(), Dict(), Dict())
        finalizer(_close_database, db)
        return db
    end
end

function _close_database(db::DuckDBHandle)
    # disconnect from DB
    if db.handle != C_NULL
        duckdb_close(db.handle)
    end
    return db.handle = C_NULL
end

"""
A connection object to a DuckDB database.

Transaction contexts are local to a single connection.

A connection can only run a single query concurrently.
It is possible to open multiple connections to a single DuckDB database instance.
Multiple connections can run multiple queries concurrently.
"""
mutable struct Connection <: DBInterface.Connection
    db::DuckDBHandle
    handle::duckdb_connection

    function Connection(db::DuckDBHandle)
        handle = Ref{duckdb_connection}()
        if duckdb_connect(db.handle, handle) != DuckDBSuccess
            throw(ConnectionException("Failed to open connection"))
        end
        con = new(db, handle[])
        finalizer(_close_connection, con)
        return con
    end
end

function _close_connection(con::Connection)
    # disconnect
    if con.handle != C_NULL
        duckdb_disconnect(con.handle)
    end
    con.handle = C_NULL
    return
end

"""
A DuckDB database object.

By default a DuckDB database object has an open connection object (db.main_connection).
When the database object is used directly in queries, it is actually the underlying main_connection that is used.

It is possible to open new connections to a single database instance using DBInterface.connect(db).
"""
mutable struct DB <: DBInterface.Connection
    handle::DuckDBHandle
    main_connection::Connection

    function DB(f::AbstractString, config::Config)
        config["threads"] = string(Threads.nthreads())
        config["external_threads"] = string(Threads.nthreads()) # all threads are external
        handle = DuckDBHandle(f, config)
        main_connection = Connection(handle)

        db = new(handle, main_connection)
        _add_table_scan(db)
        return db
    end

    function DB(f::AbstractString; config = [], readonly = false)
        config = Config(config)
        if readonly
            config["access_mode"] = "READ_ONLY"
        end
        return DB(f, config)
    end
end

function close_database(db::DB)
    _close_connection(db.main_connection)
    _close_database(db.handle)
    return
end

const VECTOR_SIZE = duckdb_vector_size()
const ROW_GROUP_SIZE = VECTOR_SIZE * 100

DB(; kwargs...) = DB(":memory:"; kwargs...)
DBInterface.connect(::Type{DB}; kwargs...) = DB(; kwargs...)
DBInterface.connect(::Type{DB}, f::AbstractString; kwargs...) = DB(f; kwargs...)
DBInterface.connect(::Type{DB}, f::AbstractString, config::Config) = DB(f, config)
DBInterface.connect(db::DB) = Connection(db.handle)
DBInterface.close!(db::DB) = close_database(db)
DBInterface.close!(con::Connection) = _close_connection(con)
Base.close(db::DB) = close_database(db)
Base.close(con::Connection) = _close_connection(con)
Base.isopen(db::DB) = db.handle.handle != C_NULL
Base.isopen(con::Connection) = con.handle != C_NULL

Base.show(io::IO, db::DuckDB.DB) = print(io, string("DuckDB.DB(", "\"$(db.handle.file)\"", ")"))
Base.show(io::IO, con::DuckDB.Connection) = print(io, string("DuckDB.Connection(", "\"$(con.db.file)\"", ")"))
5
external/duckdb/tools/juliapkg/src/ddl.jl
vendored
Normal file
@@ -0,0 +1,5 @@

function drop!(db::DB, table::AbstractString; ifexists::Bool = false)
    exists = ifexists ? "IF EXISTS" : ""
    return execute(db, "DROP TABLE $exists $(esc_id(table))")
end
17
external/duckdb/tools/juliapkg/src/exceptions.jl
vendored
Normal file
@@ -0,0 +1,17 @@
mutable struct ConnectionException <: Exception
    var::String
end
mutable struct QueryException <: Exception
    var::String
end
mutable struct NotImplementedException <: Exception
    var::String
end
mutable struct InvalidInputException <: Exception
    var::String
end

Base.showerror(io::IO, e::ConnectionException) = print(io, e.var)
Base.showerror(io::IO, e::QueryException) = print(io, e.var)
Base.showerror(io::IO, e::NotImplementedException) = print(io, e.var)
Base.showerror(io::IO, e::InvalidInputException) = print(io, e.var)
5
external/duckdb/tools/juliapkg/src/helper.jl
vendored
Normal file
@@ -0,0 +1,5 @@

function esc_id end

esc_id(x::AbstractString) = "\"" * replace(x, "\"" => "\"\"") * "\""
esc_id(X::AbstractVector{S}) where {S <: AbstractString} = join(map(esc_id, X), ',')
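`esc_id` above applies the standard SQL identifier-quoting rule: double any embedded `"` and wrap the whole name in `"`. The same escaping, sketched in Python for illustration:

```python
# SQL identifier quoting: double embedded quotes, wrap in double quotes.
def esc_id(name):
    return '"' + name.replace('"', '""') + '"'

assert esc_id('my table') == '"my table"'
assert esc_id('a"b') == '"a""b"'
# the vector method above joins escaped identifiers with commas:
assert ",".join(esc_id(x) for x in ["a", "b"]) == '"a","b"'
```

Doubling the quote character is what keeps `drop!` and similar helpers safe against identifiers that contain quotes.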
|
||||
139
external/duckdb/tools/juliapkg/src/logical_type.jl
vendored
Normal file
139
external/duckdb/tools/juliapkg/src/logical_type.jl
vendored
Normal file
@@ -0,0 +1,139 @@
|
||||
"""
|
||||
DuckDB type
|
||||
"""
|
||||
mutable struct LogicalType
|
||||
handle::duckdb_logical_type
|
||||
|
||||
function LogicalType(type::DUCKDB_TYPE)
|
||||
handle = duckdb_create_logical_type(type)
|
||||
result = new(handle)
|
||||
finalizer(_destroy_type, result)
|
||||
return result
|
||||
end
|
||||
function LogicalType(handle::duckdb_logical_type)
|
||||
result = new(handle)
|
||||
finalizer(_destroy_type, result)
|
||||
return result
|
||||
end
|
||||
end
|
||||
|
||||
function _destroy_type(type::LogicalType)
|
||||
if type.handle != C_NULL
|
||||
duckdb_destroy_logical_type(type.handle)
|
||||
end
|
||||
type.handle = C_NULL
|
||||
return
|
||||
end
|
||||
|
||||
create_logical_type(::Type{T}) where {T <: String} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_VARCHAR)
|
||||
create_logical_type(::Type{T}) where {T <: Bool} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_BOOLEAN)
|
||||
create_logical_type(::Type{T}) where {T <: Int8} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_TINYINT)
|
||||
create_logical_type(::Type{T}) where {T <: Int16} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_SMALLINT)
|
||||
create_logical_type(::Type{T}) where {T <: Int32} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_INTEGER)
|
||||
create_logical_type(::Type{T}) where {T <: Int64} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_BIGINT)
|
||||
create_logical_type(::Type{T}) where {T <: Int128} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_HUGEINT)
|
||||
create_logical_type(::Type{T}) where {T <: UInt8} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_UTINYINT)
|
||||
create_logical_type(::Type{T}) where {T <: UInt16} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_USMALLINT)
|
||||
create_logical_type(::Type{T}) where {T <: UInt32} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_UINTEGER)
|
||||
create_logical_type(::Type{T}) where {T <: UInt64} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_UBIGINT)
|
||||
create_logical_type(::Type{T}) where {T <: UInt128} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_UHUGEINT)
|
||||
create_logical_type(::Type{T}) where {T <: Float32} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_FLOAT)
|
||||
create_logical_type(::Type{T}) where {T <: Float64} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_DOUBLE)
|
||||
create_logical_type(::Type{T}) where {T <: Date} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_DATE)
|
||||
create_logical_type(::Type{T}) where {T <: Time} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_TIME)
|
||||
create_logical_type(::Type{T}) where {T <: DateTime} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_TIMESTAMP)
|
||||
create_logical_type(::Type{T}) where {T <: AbstractString} = DuckDB.LogicalType(DuckDB.DUCKDB_TYPE_VARCHAR)
|
||||
function create_logical_type(::Type{T}) where {T <: FixedDecimal}
|
||||
int_type = T.parameters[1]
|
||||
width = 0
|
||||
scale = T.parameters[2]
|
||||
if int_type == Int16
|
||||
width = 4
|
||||
elseif int_type == Int32
|
||||
width = 9
|
||||
elseif int_type == Int64
|
||||
width = 18
|
||||
elseif int_type == Int128
|
||||
        width = 38
    else
        throw(NotImplementedException("Unsupported internal type for decimal"))
    end
    return DuckDB.LogicalType(duckdb_create_decimal_type(width, scale))
end

function create_logical_type(::Type{T}) where {T}
    throw(NotImplementedException("Unsupported type for create_logical_type"))
end

function get_type_id(type::LogicalType)
    return duckdb_get_type_id(type.handle)
end

function get_internal_type_id(type::LogicalType)
    type_id = get_type_id(type)
    if type_id == DUCKDB_TYPE_DECIMAL
        type_id = duckdb_decimal_internal_type(type.handle)
    elseif type_id == DUCKDB_TYPE_ENUM
        type_id = duckdb_enum_internal_type(type.handle)
    end
    return type_id
end

function get_decimal_scale(type::LogicalType)
    return duckdb_decimal_scale(type.handle)
end

function get_enum_dictionary(type::LogicalType)
    dict::Vector{String} = Vector{String}()
    dict_size = duckdb_enum_dictionary_size(type.handle)
    for i in 1:dict_size
        val = duckdb_enum_dictionary_value(type.handle, i)
        str_val = String(unsafe_string(val))
        push!(dict, str_val)
        duckdb_free(val)
    end
    return dict
end

function get_list_child_type(type::LogicalType)
    return LogicalType(duckdb_list_type_child_type(type.handle))
end

##===--------------------------------------------------------------------===##
## Struct methods
##===--------------------------------------------------------------------===##

function get_struct_child_count(type::LogicalType)
    return duckdb_struct_type_child_count(type.handle)
end

function get_struct_child_name(type::LogicalType, index::UInt64)
    val = duckdb_struct_type_child_name(type.handle, index)
    result = unsafe_string(val)
    duckdb_free(val)
    return result
end

function get_struct_child_type(type::LogicalType, index::UInt64)
    return LogicalType(duckdb_struct_type_child_type(type.handle, index))
end

##===--------------------------------------------------------------------===##
## Union methods
##===--------------------------------------------------------------------===##

function get_union_member_count(type::LogicalType)
    return duckdb_union_type_member_count(type.handle)
end

function get_union_member_name(type::LogicalType, index::UInt64)
    val = duckdb_union_type_member_name(type.handle, index)
    result = unsafe_string(val)
    duckdb_free(val)
    return result
end

function get_union_member_type(type::LogicalType, index::UInt64)
    return LogicalType(duckdb_union_type_member_type(type.handle, index))
end
33
external/duckdb/tools/juliapkg/src/old_interface.jl
vendored
Normal file
@@ -0,0 +1,33 @@
# old interface, deprecated
open(dbpath::AbstractString) = DBInterface.connect(DuckDB.DB, dbpath)

connect(db::DB) = DBInterface.connect(db)

disconnect(con::Connection) = DBInterface.close!(con)
close(db::DB) = DBInterface.close!(db)

# not really a dataframe anymore
# if needed for backwards compatibility, can add through Requires/1.9 extension
toDataFrame(r::QueryResult) = Tables.columntable(r)
toDataFrame(con::Connection, sql::AbstractString) = toDataFrame(DBInterface.execute(con, sql))

function appendDataFrame(input_df, con::Connection, table::AbstractString, schema::String = "main")
    register_data_frame(con, input_df, "__append_df")
    DBInterface.execute(con, "INSERT INTO \"$schema\".\"$table\" SELECT * FROM __append_df")
    return unregister_data_frame(con, "__append_df")
end

appendDataFrame(input_df, db::DB, table::AbstractString, schema::String = "main") =
    appendDataFrame(input_df, db.main_connection, table, schema)

"""
    DuckDB.load!(con, input_df, table)

Load an input DataFrame `input_df` into a new DuckDB table that will be named `table`.
"""
function load!(con, input_df, table::AbstractString, schema::String = "main")
    register_data_frame(con, input_df, "__append_df")
    DBInterface.execute(con, "CREATE TABLE \"$schema\".\"$table\" AS SELECT * FROM __append_df")
    unregister_data_frame(con, "__append_df")
    return
end
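
# Usage sketch for the helpers above (hypothetical connection and table names,
# not part of this file): `load!` creates the table from a registered data
# frame, `appendDataFrame` inserts into an existing one.
#
#   using DataFrames, DuckDB
#   con = DBInterface.connect(DuckDB.DB, ":memory:")
#   df = DataFrame(a = [1, 2, 3])
#   DuckDB.load!(con, df, "my_table")            # CREATE TABLE "main"."my_table" AS ...
#   DuckDB.appendDataFrame(df, con, "my_table")  # INSERT INTO "main"."my_table" ...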
72
external/duckdb/tools/juliapkg/src/replacement_scan.jl
vendored
Normal file
@@ -0,0 +1,72 @@

mutable struct ReplacementFunction
    db::DB
    replacement_func::Function
    extra_data::Any
    uuid::UUID
end

struct ReplacementFunctionInfo
    handle::duckdb_replacement_scan_info
    main_function::ReplacementFunction
    table_name::String

    function ReplacementFunctionInfo(
        handle::duckdb_replacement_scan_info,
        main_function::ReplacementFunction,
        table_name::String
    )
        result = new(handle, main_function, table_name)
        return result
    end
end

function _replacement_scan_function(handle::duckdb_replacement_scan_info, table_name::Ptr{UInt8}, data::Ptr{Cvoid})
    try
        func::ReplacementFunction = unsafe_pointer_to_objref(data)
        tname = unsafe_string(table_name)
        info = ReplacementFunctionInfo(handle, func, tname)
        func.replacement_func(info)
    catch
        duckdb_replacement_scan_set_error(handle, get_exception_info())
        return
    end
end

function getdb(info::ReplacementFunctionInfo)
    return info.main_function.db
end

function get_extra_data(info::ReplacementFunctionInfo)
    return info.main_function.extra_data
end

function get_table_name(info::ReplacementFunctionInfo)
    return info.table_name
end

function set_function_name(info::ReplacementFunctionInfo, function_name::String)
    return duckdb_replacement_scan_set_function_name(info.handle, function_name)
end

function add_function_parameter(info::ReplacementFunctionInfo, parameter::Value)
    return duckdb_replacement_scan_add_parameter(info.handle, parameter.handle)
end

function _replacement_func_cleanup(data::Ptr{Cvoid})
    info::ReplacementFunction = unsafe_pointer_to_objref(data)
    delete!(info.db.handle.registered_objects, info.uuid)
    return
end

function add_replacement_scan!(db::DB, replacement_func::Function, extra_data::Any)
    func = ReplacementFunction(db, replacement_func, extra_data, uuid4())
    db.handle.registered_objects[func.uuid] = func
    return duckdb_add_replacement_scan(
        db.handle.handle,
        @cfunction(_replacement_scan_function, Cvoid, (duckdb_replacement_scan_info, Ptr{UInt8}, Ptr{Cvoid})),
        pointer_from_objref(func),
        @cfunction(_replacement_func_cleanup, Cvoid, (Ptr{Cvoid},))
    )
end
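
# Usage sketch (hypothetical, not part of this file): route any unresolved
# table name to DuckDB's built-in range() table function via a replacement
# scan. Assumes a DuckDB.Value constructor accepting an integer parameter.
#
#   db = DuckDB.DB()
#   add_replacement_scan!(db, info -> begin
#       set_function_name(info, "range")
#       add_function_parameter(info, DuckDB.Value(10))
#   end, nothing)
#   # after this, SELECT * FROM some_missing_table scans range(10) instead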
903
external/duckdb/tools/juliapkg/src/result.jl
vendored
Normal file
@@ -0,0 +1,903 @@
import Base.Threads.@spawn

mutable struct QueryResult
    handle::Ref{duckdb_result}
    names::Vector{Symbol}
    types::Vector{Type}
    tbl::Union{Missing, NamedTuple}
    chunk_index::UInt64

    function QueryResult(handle::Ref{duckdb_result})
        column_count = duckdb_column_count(handle)
        names::Vector{Symbol} = Vector()
        for i in 1:column_count
            name = sym(duckdb_column_name(handle, i))
            if name in view(names, 1:(i - 1))
                j = 1
                new_name = Symbol(name, :_, j)
                while new_name in view(names, 1:(i - 1))
                    j += 1
                    new_name = Symbol(name, :_, j)
                end
                name = new_name
            end
            push!(names, name)
        end
        types::Vector{Type} = Vector()
        for i in 1:column_count
            logical_type = LogicalType(duckdb_column_logical_type(handle, i))
            push!(types, Union{Missing, duckdb_type_to_julia_type(logical_type)})
        end

        result = new(handle, names, types, missing, 1)
        finalizer(_close_result, result)
        return result
    end
end

function _close_result(result::QueryResult)
    duckdb_destroy_result(result.handle)
    return
end

const DataChunks = Union{Vector{DataChunk}, Tuple{DataChunk}}

mutable struct ColumnConversionData{ChunksT <: DataChunks}
    chunks::ChunksT
    col_idx::Int64
    logical_type::LogicalType
    conversion_data::Any
end

mutable struct ListConversionData
    conversion_func::Function
    conversion_loop_func::Function
    child_type::LogicalType
    internal_type::Type
    target_type::Type
    child_conversion_data::Any
end

mutable struct StructConversionData
    tuple_type::Any
    child_conversion_data::Vector{ListConversionData}
end

nop_convert(column_data::ColumnConversionData, val) = val

function convert_string(column_data::ColumnConversionData, val::Ptr{Cvoid}, idx::UInt64)
    base_ptr = val + (idx - 1) * sizeof(duckdb_string_t)
    length_ptr = Base.unsafe_convert(Ptr{Int32}, base_ptr)
    length = unsafe_load(length_ptr)
    if length <= STRING_INLINE_LENGTH
        prefix_ptr = Base.unsafe_convert(Ptr{UInt8}, base_ptr + sizeof(Int32))
        return unsafe_string(prefix_ptr, length)
    else
        ptr_ptr = Base.unsafe_convert(Ptr{Ptr{UInt8}}, base_ptr + sizeof(Int32) * 2)
        data_ptr = Base.unsafe_load(ptr_ptr)
        return unsafe_string(data_ptr, length)
    end
end

function convert_blob(column_data::ColumnConversionData, val::Ptr{Cvoid}, idx::UInt64)::Base.CodeUnits{UInt8, String}
    return Base.codeunits(convert_string(column_data, val, idx))
end

convert_date(column_data::ColumnConversionData, val) = convert(Date, val)
convert_time(column_data::ColumnConversionData, val) = convert(Time, val)
convert_time_tz(column_data::ColumnConversionData, val) = convert(Time, convert(duckdb_time_tz, val))
convert_timestamp(column_data::ColumnConversionData, val) = convert(DateTime, convert(duckdb_timestamp, val))
convert_timestamp_s(column_data::ColumnConversionData, val) = convert(DateTime, convert(duckdb_timestamp_s, val))
convert_timestamp_ms(column_data::ColumnConversionData, val) = convert(DateTime, convert(duckdb_timestamp_ms, val))
convert_timestamp_ns(column_data::ColumnConversionData, val) = convert(DateTime, convert(duckdb_timestamp_ns, val))
convert_interval(column_data::ColumnConversionData, val::duckdb_interval) = convert(Dates.CompoundPeriod, val)
convert_hugeint(column_data::ColumnConversionData, val::duckdb_hugeint) = convert(Int128, val)
convert_uhugeint(column_data::ColumnConversionData, val::duckdb_uhugeint) = convert(UInt128, val)
convert_uuid(column_data::ColumnConversionData, val::duckdb_hugeint) = convert(UUID, val)

function convert_enum(column_data::ColumnConversionData, val)::String
    return column_data.conversion_data[val + 1]
end

function convert_decimal_hugeint(column_data::ColumnConversionData, val::duckdb_hugeint)
    return Base.reinterpret(column_data.conversion_data, convert_hugeint(column_data, val))
end

function convert_decimal(column_data::ColumnConversionData, val)
    return Base.reinterpret(column_data.conversion_data, val)
end

function convert_vector(
    column_data::ColumnConversionData,
    vector::Vec,
    size::UInt64,
    convert_func::Function,
    result,
    position,
    all_valid,
    ::Type{SRC},
    ::Type{DST}
) where {SRC, DST}
    array = get_array(vector, SRC, size)
    if !all_valid
        validity = get_validity(vector, size)
    end
    for i in 1:size
        if all_valid || isvalid(validity, i)
            result[position] = convert_func(column_data, array[i])
        end
        position += 1
    end
    return size
end

function convert_vector_string(
    column_data::ColumnConversionData,
    vector::Vec,
    size::UInt64,
    convert_func::Function,
    result,
    position,
    all_valid,
    ::Type{SRC},
    ::Type{DST}
) where {SRC, DST}
    raw_ptr = duckdb_vector_get_data(vector.handle)
    ptr = Base.unsafe_convert(Ptr{duckdb_string_t}, raw_ptr)
    if !all_valid
        validity = get_validity(vector, size)
    end
    for i in 1:size
        if all_valid || isvalid(validity, i)
            result[position] = convert_func(column_data, raw_ptr, i)
        end
        position += 1
    end
    return size
end

function convert_vector_list(
    column_data::ColumnConversionData,
    vector::Vec,
    size::UInt64,
    convert_func::Function,
    result,
    position,
    all_valid,
    ::Type{SRC},
    ::Type{DST}
) where {SRC, DST}
    child_vector = list_child(vector)
    lsize = list_size(vector)

    # convert the child vector
    ldata = column_data.conversion_data

    child_column_data =
        ColumnConversionData(column_data.chunks, column_data.col_idx, ldata.child_type, ldata.child_conversion_data)
    child_array = Array{Union{Missing, ldata.target_type}}(missing, lsize)
    ldata.conversion_loop_func(
        child_column_data,
        child_vector,
        lsize,
        ldata.conversion_func,
        child_array,
        1,
        false,
        ldata.internal_type,
        ldata.target_type
    )

    array = get_array(vector, SRC, size)
    if !all_valid
        validity = get_validity(vector, size)
    end
    for i in 1:size
        if all_valid || isvalid(validity, i)
            start_offset::UInt64 = array[i].offset + 1
            end_offset::UInt64 = array[i].offset + array[i].length
            result[position] = child_array[start_offset:end_offset]
        end
        position += 1
    end
    return size
end

function convert_struct_children(column_data::ColumnConversionData, vector::Vec, size::UInt64)
    # convert the child vectors of the struct
    child_count = get_struct_child_count(column_data.logical_type)
    child_arrays = Vector()
    for i in 1:child_count
        child_vector = struct_child(vector, i)
        ldata = column_data.conversion_data.child_conversion_data[i]

        child_column_data =
            ColumnConversionData(column_data.chunks, column_data.col_idx, ldata.child_type, ldata.child_conversion_data)
        child_array = Array{Union{Missing, ldata.target_type}}(missing, size)
        ldata.conversion_loop_func(
            child_column_data,
            child_vector,
            size,
            ldata.conversion_func,
            child_array,
            1,
            false,
            ldata.internal_type,
            ldata.target_type
        )
        push!(child_arrays, child_array)
    end
    return child_arrays
end

function convert_vector_struct(
    column_data::ColumnConversionData,
    vector::Vec,
    size::UInt64,
    convert_func::Function,
    result,
    position,
    all_valid,
    ::Type{SRC},
    ::Type{DST}
) where {SRC, DST}
    child_count = get_struct_child_count(column_data.logical_type)
    child_arrays = convert_struct_children(column_data, vector, size)

    if !all_valid
        validity = get_validity(vector, size)
    end
    for i in 1:size
        if all_valid || isvalid(validity, i)
            result_tuple = Vector()
            for child_idx in 1:child_count
                push!(result_tuple, child_arrays[child_idx][i])
            end
            result[position] = NamedTuple{column_data.conversion_data.tuple_type}(result_tuple)
        end
        position += 1
    end
    return size
end

function convert_vector_union(
    column_data::ColumnConversionData,
    vector::Vec,
    size::UInt64,
    convert_func::Function,
    result,
    position,
    all_valid,
    ::Type{SRC},
    ::Type{DST}
) where {SRC, DST}
    child_arrays = convert_struct_children(column_data, vector, size)

    if !all_valid
        validity = get_validity(vector, size)
    end
    for row in 1:size
        # For every row/record
        if all_valid || isvalid(validity, row)
            # Get the tag of this row
            tag::UInt64 = child_arrays[1][row]
            type::DataType = duckdb_type_to_julia_type(get_union_member_type(column_data.logical_type, tag + 1))
            # Get the value from the child array indicated by the tag
            # Offset by 1 because of Julia
            # Offset by another 1 because of the tag vector
            value = child_arrays[tag + 2][row]
            result[position] = isequal(value, missing) ? missing : type(value)
        end
        position += 1
    end
    return size
end

function convert_vector_map(
    column_data::ColumnConversionData,
    vector::Vec,
    size::UInt64,
    convert_func::Function,
    result,
    position,
    all_valid,
    ::Type{SRC},
    ::Type{DST}
) where {SRC, DST}
    child_vector = list_child(vector)
    lsize = list_size(vector)

    # convert the child vector
    ldata = column_data.conversion_data

    child_column_data =
        ColumnConversionData(column_data.chunks, column_data.col_idx, ldata.child_type, ldata.child_conversion_data)
    child_array = Array{Union{Missing, ldata.target_type}}(missing, lsize)
    ldata.conversion_loop_func(
        child_column_data,
        child_vector,
        lsize,
        ldata.conversion_func,
        child_array,
        1,
        false,
        ldata.internal_type,
        ldata.target_type
    )
    child_arrays = convert_struct_children(child_column_data, child_vector, lsize)
    keys = child_arrays[1]
    values = child_arrays[2]

    array = get_array(vector, SRC, size)
    if !all_valid
        validity = get_validity(vector, size)
    end
    for i in 1:size
        if all_valid || isvalid(validity, i)
            result_dict = Dict()
            start_offset::UInt64 = array[i].offset + 1
            end_offset::UInt64 = array[i].offset + array[i].length
            for key_idx in start_offset:end_offset
                result_dict[keys[key_idx]] = values[key_idx]
            end
            result[position] = result_dict
        end
        position += 1
    end
    return size
end

function convert_column_loop(
    column_data::ColumnConversionData,
    convert_func::Function,
    ::Type{SRC},
    ::Type{DST},
    convert_vector_func::Function
) where {SRC, DST}
    # first check if there are null values in any chunks
    has_missing = false
    row_count = 0
    for chunk in column_data.chunks
        if !all_valid(chunk, column_data.col_idx)
            has_missing = true
        end
        row_count += get_size(chunk)
    end
    if has_missing
        # missing values
        result = Array{Union{Missing, DST}}(missing, row_count)
        position = 1
        for chunk in column_data.chunks
            position += convert_vector_func(
                column_data,
                get_vector(chunk, column_data.col_idx),
                get_size(chunk),
                convert_func,
                result,
                position,
                all_valid(chunk, column_data.col_idx),
                SRC,
                DST
            )
        end
    else
        # no missing values
        result = Array{DST}(undef, row_count)
        position = 1
        for chunk in column_data.chunks
            position += convert_vector_func(
                column_data,
                get_vector(chunk, column_data.col_idx),
                get_size(chunk),
                convert_func,
                result,
                position,
                true,
                SRC,
                DST
            )
        end
    end
    return result
end

function create_child_conversion_data(child_type::LogicalType)
    internal_type_id = get_internal_type_id(child_type)
    internal_type = duckdb_type_to_internal_type(internal_type_id)
    target_type = duckdb_type_to_julia_type(child_type)

    conversion_func = get_conversion_function(child_type)
    conversion_loop_func = get_conversion_loop_function(child_type)
    child_conversion_data = init_conversion_loop(child_type)
    return ListConversionData(
        conversion_func,
        conversion_loop_func,
        child_type,
        internal_type,
        target_type,
        child_conversion_data
    )
end

function init_conversion_loop(logical_type::LogicalType)
    type = get_type_id(logical_type)
    if type == DUCKDB_TYPE_DECIMAL
        return duckdb_type_to_julia_type(logical_type)
    elseif type == DUCKDB_TYPE_ENUM
        return get_enum_dictionary(logical_type)
    elseif type == DUCKDB_TYPE_LIST || type == DUCKDB_TYPE_MAP
        child_type = get_list_child_type(logical_type)
        return create_child_conversion_data(child_type)
    elseif type == DUCKDB_TYPE_STRUCT || type == DUCKDB_TYPE_UNION
        child_count_fun::Function = get_struct_child_count
        child_type_fun::Function = get_struct_child_type
        child_name_fun::Function = get_struct_child_name

        #if type == DUCKDB_TYPE_UNION
        #    child_count_fun = get_union_member_count
        #    child_type_fun = get_union_member_type
        #    child_name_fun = get_union_member_name
        #end

        child_count = child_count_fun(logical_type)
        child_symbols::Vector{Symbol} = Vector()
        child_data::Vector{ListConversionData} = Vector()
        for i in 1:child_count
            child_symbol = Symbol(child_name_fun(logical_type, i))
            child_type = child_type_fun(logical_type, i)
            child_conv_data = create_child_conversion_data(child_type)
            push!(child_symbols, child_symbol)
            push!(child_data, child_conv_data)
        end
        return StructConversionData(Tuple(x for x in child_symbols), child_data)
    else
        return nothing
    end
end

function get_conversion_function(logical_type::LogicalType)::Function
    type = get_type_id(logical_type)
    if type == DUCKDB_TYPE_VARCHAR
        return convert_string
    elseif type == DUCKDB_TYPE_BLOB || type == DUCKDB_TYPE_BIT
        return convert_blob
    elseif type == DUCKDB_TYPE_DATE
        return convert_date
    elseif type == DUCKDB_TYPE_TIME
        return convert_time
    elseif type == DUCKDB_TYPE_TIME_TZ
        return convert_time_tz
    elseif type == DUCKDB_TYPE_TIMESTAMP || type == DUCKDB_TYPE_TIMESTAMP_TZ
        return convert_timestamp
    elseif type == DUCKDB_TYPE_TIMESTAMP_S
        return convert_timestamp_s
    elseif type == DUCKDB_TYPE_TIMESTAMP_MS
        return convert_timestamp_ms
    elseif type == DUCKDB_TYPE_TIMESTAMP_NS
        return convert_timestamp_ns
    elseif type == DUCKDB_TYPE_INTERVAL
        return convert_interval
    elseif type == DUCKDB_TYPE_HUGEINT
        return convert_hugeint
    elseif type == DUCKDB_TYPE_UHUGEINT
        return convert_uhugeint
    elseif type == DUCKDB_TYPE_UUID
        return convert_uuid
    elseif type == DUCKDB_TYPE_DECIMAL
        internal_type_id = get_internal_type_id(logical_type)
        if internal_type_id == DUCKDB_TYPE_HUGEINT
            return convert_decimal_hugeint
        else
            return convert_decimal
        end
    elseif type == DUCKDB_TYPE_ENUM
        return convert_enum
    else
        return nop_convert
    end
end

function get_conversion_loop_function(logical_type::LogicalType)::Function
    type = get_type_id(logical_type)
    if type == DUCKDB_TYPE_VARCHAR || type == DUCKDB_TYPE_BLOB || type == DUCKDB_TYPE_BIT
        return convert_vector_string
    elseif type == DUCKDB_TYPE_LIST
        return convert_vector_list
    elseif type == DUCKDB_TYPE_STRUCT
        return convert_vector_struct
    elseif type == DUCKDB_TYPE_MAP
        return convert_vector_map
    elseif type == DUCKDB_TYPE_UNION
        return convert_vector_union
    else
        return convert_vector
    end
end

function convert_column(column_data::ColumnConversionData)
    internal_type_id = get_internal_type_id(column_data.logical_type)
    internal_type = duckdb_type_to_internal_type(internal_type_id)
    target_type = duckdb_type_to_julia_type(column_data.logical_type)

    conversion_func = get_conversion_function(column_data.logical_type)
    conversion_loop_func = get_conversion_loop_function(column_data.logical_type)

    column_data.conversion_data = init_conversion_loop(column_data.logical_type)
    return convert_column_loop(column_data, conversion_func, internal_type, target_type, conversion_loop_func)
end

function convert_columns(q::QueryResult, chunks::DataChunks, column_count::Integer = duckdb_column_count(q.handle))
    return NamedTuple{Tuple(q.names)}(ntuple(column_count) do i
        j = Int64(i)
        logical_type = LogicalType(duckdb_column_logical_type(q.handle, j))
        column_data = ColumnConversionData(chunks, j, logical_type, nothing)
        return convert_column(column_data)
    end)
end

function Tables.columns(q::QueryResult)
    if q.tbl === missing
        if q.chunk_index != 1
            throw(
                NotImplementedException(
                    "Materializing into a Julia table is not supported after calling nextDataChunk"
                )
            )
        end
        # gather all the data chunks
        chunks::Vector{DataChunk} = []
        while true
            # fetch the next chunk
            chunk = DuckDB.nextDataChunk(q)
            if chunk === missing
                # consumed all chunks
                break
            end
            push!(chunks, chunk)
        end

        q.tbl = convert_columns(q, chunks)
    end
    return Tables.CopiedColumns(q.tbl)
end

mutable struct PendingQueryResult
    handle::duckdb_pending_result
    success::Bool

    function PendingQueryResult(stmt::Stmt)
        pending_handle = Ref{duckdb_pending_result}()
        ret = executePending(stmt.handle, pending_handle, stmt.result_type)
        result = new(pending_handle[], ret == DuckDBSuccess)
        finalizer(_close_pending_result, result)
        return result
    end
end

function executePending(
    handle::duckdb_prepared_statement,
    pending_handle::Ref{duckdb_pending_result},
    ::Type{MaterializedResult}
)
    return duckdb_pending_prepared(handle, pending_handle)
end

function executePending(
    handle::duckdb_prepared_statement,
    pending_handle::Ref{duckdb_pending_result},
    ::Type{StreamResult}
)
    return duckdb_pending_prepared_streaming(handle, pending_handle)
end

function _close_pending_result(pending::PendingQueryResult)
    if pending.handle == C_NULL
        return
    end
    duckdb_destroy_pending(pending.handle)
    pending.handle = C_NULL
    return
end

function fetch_error(sql::AbstractString, error_ptr)
    if error_ptr == C_NULL
        return string("Execute of query \"", sql, "\" failed: unknown error")
    else
        return string("Execute of query \"", sql, "\" failed: ", unsafe_string(error_ptr))
    end
end

function get_error(stmt::Stmt, pending::PendingQueryResult)
    error_ptr = duckdb_pending_error(pending.handle)
    error_message = fetch_error(stmt.sql, error_ptr)
    _close_pending_result(pending)
    return error_message
end

# execute tasks from a pending query result in a loop
function pending_execute_tasks(pending::PendingQueryResult)::Bool
    ret = DUCKDB_PENDING_RESULT_NOT_READY
    while !duckdb_pending_execution_is_finished(ret)
        GC.safepoint()
        ret = duckdb_pending_execute_task(pending.handle)
    end
    return ret != DUCKDB_PENDING_ERROR
end

function pending_execute_check_state(pending::PendingQueryResult)::duckdb_pending_state
    ret = duckdb_pending_execute_check_state(pending.handle)
    return ret
end

# execute background tasks in a loop, until task execution is finished
function execute_tasks(state::duckdb_task_state, con::Connection)
    while !duckdb_task_state_is_finished(state)
        duckdb_execute_n_tasks_state(state, 1)
        GC.safepoint()
        Base.yield()
        if duckdb_execution_is_finished(con.handle)
            break
        end
    end
    return
end

# cleanup background tasks
function cleanup_tasks(tasks, state)
    # mark execution as finished so the individual tasks will quit
    duckdb_finish_execution(state)
    # now wait for all tasks to finish executing
    exceptions = []
    for task in tasks
        try
            Base.wait(task)
        catch ex
            push!(exceptions, ex)
        end
    end
    # clean up the tasks and task state
    empty!(tasks)
    duckdb_destroy_task_state(state)

    # if any tasks threw, propagate the error upwards by throwing as well
    for ex in exceptions
        throw(ex)
    end
    return
end

function execute_singlethreaded(pending::PendingQueryResult)::Bool
    # Only when there are no additional threads, use the main thread to execute
    success = true
    try
        # now start executing tasks of the pending result in a loop
        success = pending_execute_tasks(pending)
    catch ex
        throw(ex)
    end
    return success
end

function execute_multithreaded(stmt::Stmt, pending::PendingQueryResult)
    # if multi-threading is enabled, launch background tasks
    task_state = duckdb_create_task_state(stmt.con.db.handle)

    tasks = []
    for _ in 1:Threads.nthreads()
        task_val = @spawn execute_tasks(task_state, stmt.con)
        push!(tasks, task_val)
    end

    # When we have additional worker threads, don't execute using the main thread
    while duckdb_execution_is_finished(stmt.con.handle) == false
        ret = pending_execute_check_state(pending)
        if ret == DUCKDB_PENDING_RESULT_READY || ret == DUCKDB_PENDING_ERROR
            break
        end
        Base.yield()
        GC.safepoint()
    end

    # we finished execution of all tasks, cleanup the tasks
    return cleanup_tasks(tasks, task_state)
end
|
||||
# this function is responsible for executing a statement and returning a result
|
||||
function execute(stmt::Stmt, params::DBInterface.StatementParams = ())
|
||||
bind_parameters(stmt, params)
|
||||
|
||||
# first create a pending query result
|
||||
pending = PendingQueryResult(stmt)
|
||||
if !pending.success
|
||||
throw(QueryException(get_error(stmt, pending)))
|
||||
end
|
||||
|
||||
success = true
|
||||
if Threads.nthreads() == 1
|
||||
success = execute_singlethreaded(pending)
|
||||
# check if an error was thrown
|
||||
if !success
|
||||
throw(QueryException(get_error(stmt, pending)))
|
||||
end
|
||||
else
|
||||
execute_multithreaded(stmt, pending)
|
||||
end
|
||||
|
||||
handle = Ref{duckdb_result}()
|
||||
    ret = duckdb_execute_pending(pending.handle, handle)
    if ret != DuckDBSuccess
        error_ptr = duckdb_result_error(handle)
        error_message = fetch_error(stmt.sql, error_ptr)
        duckdb_destroy_result(handle)
        throw(QueryException(error_message))
    end
    return QueryResult(handle)
end

# explicitly close prepared statement
DBInterface.close!(stmt::Stmt) = _close_stmt(stmt)

function execute(con::Connection, sql::AbstractString, params::DBInterface.StatementParams)
    stmt = Stmt(con, sql, MaterializedResult)
    try
        return execute(stmt, params)
    finally
        _close_stmt(stmt) # immediately close, don't wait for GC
    end
end

execute(con::Connection, sql::AbstractString; kwargs...) = execute(con, sql, values(kwargs))
execute(db::DB, sql::AbstractString, params::DBInterface.StatementParams) = execute(db.main_connection, sql, params)
execute(db::DB, sql::AbstractString; kwargs...) = execute(db.main_connection, sql, values(kwargs))


Tables.istable(::Type{QueryResult}) = true
Tables.isrowtable(::Type{QueryResult}) = true
Tables.columnaccess(::Type{QueryResult}) = true
Tables.schema(q::QueryResult) = Tables.Schema(q.names, q.types)
Base.IteratorSize(::Type{QueryResult}) = Base.SizeUnknown()
Base.eltype(q::QueryResult) = Any

DBInterface.close!(q::QueryResult) = _close_result(q)

Base.iterate(q::QueryResult) = iterate(Tables.rows(Tables.columns(q)))
Base.iterate(q::QueryResult, state) = iterate(Tables.rows(Tables.columns(q)), state)

struct QueryResultChunk
    tbl::NamedTuple
end

function Tables.columns(chunk::QueryResultChunk)
    return Tables.CopiedColumns(chunk.tbl)
end

Tables.istable(::Type{QueryResultChunk}) = true
Tables.isrowtable(::Type{QueryResultChunk}) = true
Tables.columnaccess(::Type{QueryResultChunk}) = true
Tables.schema(chunk::QueryResultChunk) = Tables.Schema(chunk.q.names, chunk.q.types)

struct QueryResultChunkIterator
    q::QueryResult
    column_count::Int64
end

function next_chunk(iter::QueryResultChunkIterator)
    chunk = DuckDB.nextDataChunk(iter.q)
    if chunk === missing
        return nothing
    end

    return QueryResultChunk(convert_columns(iter.q, (chunk,), iter.column_count))
end

Base.iterate(iter::QueryResultChunkIterator) = iterate(iter, 0x0000000000000001)

function Base.iterate(iter::QueryResultChunkIterator, state)
    if iter.q.chunk_index != state
        throw(
            NotImplementedException(
                "Iterating chunks more than once is not supported. " *
                "(Did you iterate the result of Tables.partitions() once already, call nextDataChunk or materialise QueryResult?)"
            )
        )
    end
    chunk = next_chunk(iter)
    if chunk === nothing
        return nothing
    end
    return (chunk, state + 1)
end

Base.IteratorSize(::Type{QueryResultChunkIterator}) = Base.SizeUnknown()
Base.eltype(iter::QueryResultChunkIterator) = Any

function Tables.partitions(q::QueryResult)
    column_count = duckdb_column_count(q.handle)
    return QueryResultChunkIterator(q, column_count)
end
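The chunk iterator above is what backs `Tables.partitions`, letting a sink consume a result one DuckDB data chunk at a time instead of materializing everything at once. A minimal usage sketch (assumes the DuckDB.jl, Tables.jl and DataFrames.jl packages are installed; the query is made up):

```julia
using DuckDB, Tables, DataFrames

db = DuckDB.DB()
res = DBInterface.execute(db, "SELECT i, i * 2 AS j FROM range(5) t(i)")

# each partition wraps one DuckDB data chunk; per the check in
# Base.iterate above, the partitions can be consumed only once
for chunk in Tables.partitions(res)
    df = DataFrame(chunk)  # any Tables.jl-compatible sink works per chunk
end
```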

function nextDataChunk(q::QueryResult)::Union{Missing, DataChunk}
    if duckdb_result_is_streaming(q.handle[])
        chunk_handle = duckdb_stream_fetch_chunk(q.handle[])
        if chunk_handle == C_NULL
            return missing
        end
        chunk = DataChunk(chunk_handle, true)
        if get_size(chunk) == 0
            return missing
        end
    else
        chunk_count = duckdb_result_chunk_count(q.handle[])
        if q.chunk_index > chunk_count
            return missing
        end
        chunk = DataChunk(duckdb_result_get_chunk(q.handle[], q.chunk_index), true)
    end
    q.chunk_index += 1
    return chunk
end

"Return the last row insert id from the executed statement"
DBInterface.lastrowid(con::Connection) = throw(NotImplementedException("Unimplemented: lastrowid"))
DBInterface.lastrowid(db::DB) = DBInterface.lastrowid(db.main_connection)

"""
    DBInterface.prepare(db::DuckDB.DB, sql::AbstractString)

Prepare an SQL statement given as a string in the DuckDB database; returns a `DuckDB.Stmt` object.
See [`DBInterface.execute`](@ref) for information on executing a prepared statement and passing parameters to bind.
A `DuckDB.Stmt` object can be closed (its resources freed) using [`DBInterface.close!`](@ref).
"""
DBInterface.prepare(con::Connection, sql::AbstractString, result_type::Type) = Stmt(con, sql, result_type)
DBInterface.prepare(con::Connection, sql::AbstractString) = DBInterface.prepare(con, sql, MaterializedResult)
DBInterface.prepare(db::DB, sql::AbstractString) = DBInterface.prepare(db.main_connection, sql)
DBInterface.prepare(db::DB, sql::AbstractString, result_type::Type) =
    DBInterface.prepare(db.main_connection, sql, result_type)

"""
    DBInterface.execute(db::DuckDB.DB, sql::String, [params])
    DBInterface.execute(stmt::DuckDB.Stmt, [params])

Bind any positional (`params` as `Vector` or `Tuple`) or named (`params` as `NamedTuple` or `Dict`) parameters to an SQL statement, given by `db` and `sql` or
as an already prepared statement `stmt`, execute the query, and return an iterator of result rows.

Note that the returned result row iterator only supports single-pass, forward-only iteration of the result rows.
To iterate over the results again, re-execute the query to obtain a fresh result.

The result-set iterator supports the [Tables.jl](https://github.com/JuliaData/Tables.jl) interface, so results can be collected in any Tables.jl-compatible sink,
like `DataFrame(results)`, `CSV.write("results.csv", results)`, etc.
"""
DBInterface.execute(stmt::Stmt, params::DBInterface.StatementParams) = execute(stmt, params)
function DBInterface.execute(con::Connection, sql::AbstractString, result_type::Type)
    stmt = Stmt(con, sql, result_type)
    try
        return execute(stmt)
    finally
        _close_stmt(stmt) # immediately close, don't wait for GC
    end
end
DBInterface.execute(con::Connection, sql::AbstractString) = DBInterface.execute(con, sql, MaterializedResult)
DBInterface.execute(db::DB, sql::AbstractString, result_type::Type) =
    DBInterface.execute(db.main_connection, sql, result_type)

Base.show(io::IO, result::DuckDB.QueryResult) = print(io, Tables.columntable(result))

"""
Executes a SQL query within a connection and returns the full (materialized) result.

The `query` function can run queries consisting of multiple statements, unlike [`DBInterface.execute`](@ref), which can only prepare a single statement.
"""
function query(con::DuckDB.Connection, sql::AbstractString)
    handle = Ref{duckdb_result}()
    ret = duckdb_query(con.handle, sql, handle)
    if ret != DuckDBSuccess
        error_ptr = duckdb_result_error(handle)
        error_message = fetch_error(sql, error_ptr)
        duckdb_destroy_result(handle)
        throw(QueryException(error_message))
    end
    return QueryResult(handle)
end
query(db::DuckDB.DB, sql::AbstractString) = query(db.main_connection, sql)
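The `prepare`/`execute`/`query` entry points above combine as follows. A minimal usage sketch (assumes the DuckDB.jl and DataFrames.jl packages are installed; table and column names are made up):

```julia
using DuckDB, DataFrames

db = DuckDB.DB()  # in-memory database
DBInterface.execute(db, "CREATE TABLE items (id INTEGER, name VARCHAR)")

# positional parameters via a prepared statement
stmt = DBInterface.prepare(db, "INSERT INTO items VALUES (?, ?)")
DBInterface.execute(stmt, (1, "hammer"))
DBInterface.close!(stmt)  # free resources immediately, don't wait for GC

# collect the materialized result into any Tables.jl sink
df = DataFrame(DBInterface.execute(db, "SELECT * FROM items"))

# query() accepts multiple statements in one string
DuckDB.query(db, "CREATE TABLE t2 (x INTEGER); INSERT INTO t2 VALUES (42);")
```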
387
external/duckdb/tools/juliapkg/src/scalar_function.jl
vendored
Normal file
@@ -0,0 +1,387 @@
#=
//===--------------------------------------------------------------------===//
// Scalar Function
//===--------------------------------------------------------------------===//
=#
"""
    ScalarFunction(
        name::AbstractString,
        parameters::Vector{DataType},
        return_type::DataType,
        func,
        wrapper = nothing
    )

Creates a new scalar function object. It is recommended to use the `@create_scalar_function`
macro to create a new scalar function instead of calling this constructor directly.

# Arguments
- `name::AbstractString`: The name of the function.
- `parameters::Vector{DataType}`: The data types of the parameters.
- `return_type::DataType`: The return type of the function.
- `func`: The function to be called.
- `wrapper`: The wrapper function that is used to call the function from DuckDB.
- `wrapper_id`: A unique id for the wrapper function.

See also [`register_scalar_function`](@ref), [`@create_scalar_function`](@ref)
"""
mutable struct ScalarFunction
    handle::duckdb_scalar_function
    name::AbstractString
    parameters::Vector{DataType}
    return_type::DataType
    logical_parameters::Vector{LogicalType}
    logical_return_type::LogicalType
    func::Function
    wrapper::Union{Nothing, Function} # hold a reference to the wrapper function to prevent GC
    wrapper_id::Union{Nothing, UInt64}

    function ScalarFunction(
        name::AbstractString,
        parameters::Vector{DataType},
        return_type::DataType,
        func,
        wrapper = nothing,
        wrapper_id = nothing
    )
        handle = duckdb_create_scalar_function()
        duckdb_scalar_function_set_name(handle, name)

        logical_parameters = Vector{LogicalType}()
        for parameter_type in parameters
            push!(logical_parameters, create_logical_type(parameter_type))
        end
        logical_return_type = create_logical_type(return_type)

        for param in logical_parameters
            duckdb_scalar_function_add_parameter(handle, param.handle)
        end
        duckdb_scalar_function_set_return_type(handle, logical_return_type.handle)
        result = new(
            handle,
            name,
            parameters,
            return_type,
            logical_parameters,
            logical_return_type,
            func,
            wrapper,
            wrapper_id
        )
        finalizer(_destroy_scalar_function, result)
        duckdb_scalar_function_set_extra_info(handle, pointer_from_objref(result), C_NULL)

        return result
    end
end

name(func::ScalarFunction) = func.name
signature(func::ScalarFunction) = string(func.name, "(", join(func.parameters, ", "), ") -> ", func.return_type)

function Base.show(io::IO, func::ScalarFunction)
    print(io, "DuckDB.ScalarFunction(", signature(func), ")")
    return
end

function _destroy_scalar_function(func::ScalarFunction)
    # disconnect from DB
    if func.handle != C_NULL
        duckdb_destroy_scalar_function(func.handle)
    end

    # remove the wrapper from the cache
    if func.wrapper_id !== nothing && func.wrapper_id in keys(_UDF_WRAPPER_CACHE)
        delete!(_UDF_WRAPPER_CACHE, func.wrapper_id)
    end

    func.handle = C_NULL
    return
end


"""
    register_scalar_function(db::DB, fun::ScalarFunction)
    register_scalar_function(con::Connection, fun::ScalarFunction)

Register a scalar function in the database.
"""
register_scalar_function(db::DB, fun::ScalarFunction) = register_scalar_function(db.main_connection, fun)
function register_scalar_function(con::Connection, fun::ScalarFunction)
    if fun.name in keys(con.db.scalar_functions)
        throw(ArgumentError(string("Scalar function \"", fun.name, "\" already registered")))
    end

    result = duckdb_register_scalar_function(con.handle, fun.handle)
    if result != DuckDBSuccess
        throw(ArgumentError(string("Failed to register scalar function \"", fun.name, "\"")))
    end

    con.db.scalar_functions[fun.name] = fun
    return
end


# %% --- Scalar Function Macro ------------------------------------------ #

"""
    name, at, rt = _udf_parse_function_expr(expr::Expr)

Parses a function expression and returns the function name, parameters, and return type.
The parameters are returned as a vector of (argument name, argument type) tuples.

# Example

```julia
expr = :(my_sum(a::Int, b::Int)::Int)
name, at, rt = _udf_parse_function_expr(expr)
```
"""
function _udf_parse_function_expr(expr::Expr)
    function parse_parameter(parameter_expr::Expr)
        parameter_expr.head === :(::) || throw(ArgumentError("parameter_expr must be a type annotation"))
        parameter, parameter_type = parameter_expr.args

        if !isa(parameter, Symbol)
            throw(ArgumentError("parameter name must be a symbol"))
        end

        # if !isa(parameter_type, Symbol)
        #     throw(ArgumentError("parameter_type must be a symbol"))
        # end

        return parameter, parameter_type
    end

    expr.head === :(::) ||
        throw(ArgumentError("expr must be a typed function signature, e.g. func(a::Int, b::String)::Int"))
    inner, return_type = expr.args

    # parse inner
    if !isa(inner, Expr)
        throw(ArgumentError("inner must be an expression"))
    end

    inner.head === :call ||
        throw(ArgumentError("expr must be a typed function signature, e.g. func(a::Int, b::String)::Int"))
    func_name = inner.args[1]
    parameters = parse_parameter.(inner.args[2:end])
    return func_name, parameters, return_type
end

function _udf_generate_conversion_expressions(parameters, logical_type, convert, var_name, chunk_name)
    # Example:
    #   data_1 = convert(Int, LT[1], chunk, 1)
    var_names = [Symbol("$(var_name)_$(i)") for i in 1:length(parameters)]
    expressions = [
        Expr(:(=), var_names[i], Expr(:call, convert, p_type, Expr(:ref, logical_type, i), chunk_name, i)) for
        (i, (p_name, p_type)) in enumerate(parameters)
    ]
    return var_names, expressions
end


function _udf_generate_wrapper(func_expr, func_esc)
    index_name = :i
    log_param_types_name = :log_param_types
    log_return_type_name = :log_return_type

    # Parse the function definition, e.g. my_func(a::Int, b::String)::Int
    func, parameters, return_type = _udf_parse_function_expr(func_expr)
    func_name = string(func)

    # Generate expressions to unpack the data chunk:
    #   param_1 = convert(Int, LT[1], chunk, 1)
    #   param_2 = convert(Int, LT[2], chunk, 2)
    var_names, input_assignments =
        _udf_generate_conversion_expressions(parameters, log_param_types_name, :_udf_convert_chunk, :param, :chunk)

    # Generate the call expression: result = func(param_1, param_2, ...)
    call_args_loop = [:($var_name[$index_name]) for var_name in var_names]
    call_expr = Expr(:call, func_esc, call_args_loop...)

    # Generate the validity expression: get_validity(chunk, i)
    validity_expr_i = i -> Expr(:call, :get_validity, :chunk, i)
    validity_expr = Expr(:tuple, (validity_expr_i(i) for i in 1:length(parameters))...)

    return quote
        function (info::DuckDB.duckdb_function_info, input::DuckDB.duckdb_data_chunk, output::DuckDB.duckdb_vector)
            extra_info_ptr = DuckDB.duckdb_scalar_function_get_extra_info(info)
            scalar_func::DuckDB.ScalarFunction = unsafe_pointer_to_objref(extra_info_ptr)
            $log_param_types_name::Vector{LogicalType} = scalar_func.logical_parameters
            $log_return_type_name::LogicalType = scalar_func.logical_return_type

            try
                vec = Vec(output)
                chunk = DataChunk(input, false) # create a data chunk object that does not own the data
                $(input_assignments...) # assign the input values
                N = Int64(get_size(chunk))

                # initialize the result container, to avoid calling get_array() in the loop
                result_container = _udf_assign_result_init($return_type, vec)

                # check data validity
                validity = $validity_expr
                chunk_is_valid = all(all_valid.(validity))
                result_validity = get_validity(vec)

                for $index_name in 1:N
                    if chunk_is_valid || all(isvalid(v, $index_name) for v in validity)
                        result::$return_type = $call_expr

                        # hopefully this is optimized away if the type has no missing values
                        if ismissing(result)
                            setinvalid(result_validity, $index_name)
                        else
                            _udf_assign_result!(result_container, $return_type, vec, result, $index_name)
                        end
                    else
                        setinvalid(result_validity, $index_name)
                    end
                end
                return nothing
            catch e
                duckdb_scalar_function_set_error(
                    info,
                    "Exception in " * signature(scalar_func) * ": " * get_exception_info()
                )
            end
        end
    end
end

"""
Internal storage for globally accessible function pointers.

HACK: This is a workaround to dynamically generate a function pointer on ALL architectures.
"""
const _UDF_WRAPPER_CACHE = Dict{UInt64, Function}()

function _udf_register_wrapper(id, wrapper)
    if id in keys(_UDF_WRAPPER_CACHE)
        throw(
            InvalidInputException(
                "A function with the same id has already been registered. This should not happen. Please report this issue."
            )
        )
    end

    _UDF_WRAPPER_CACHE[id] = wrapper

    # HACK: This is a workaround to dynamically generate a function pointer on ALL architectures.
    # We need to delay the cfunction call until the moment the wrapper function is generated.
    fptr = QuoteNode(:(_UDF_WRAPPER_CACHE[$id]))
    cfunction_type = Ptr{Cvoid}
    rt = :Cvoid
    at = :(duckdb_function_info, duckdb_data_chunk, duckdb_vector)
    attr_svec = Expr(:call, GlobalRef(Core, :svec), at.args...)
    cfun = Expr(:cfunction, cfunction_type, fptr, rt, attr_svec, QuoteNode(:ccall))
    ptr = eval(cfun)
    return ptr
end


"""
    @create_scalar_function func_expr [func_ref]

Creates a new ScalarFunction object that can be registered in a DuckDB database.

# Arguments
- `func_expr`: An expression that defines the function signature.
- `func_ref`: An optional definition of the function or a closure. If omitted, it is assumed that a function with the same name given in `func_expr` is defined in the global scope.

# Example

```julia
db = DuckDB.DB()
my_add(a, b) = a + b
fun = @create_scalar_function my_add(a::Int, b::Int)::Int
DuckDB.register_scalar_function(db, fun) # register the UDF
```
"""
macro create_scalar_function(func_expr, func_ref = nothing)
    func, parameters, return_type = _udf_parse_function_expr(func_expr)
    if func_ref !== nothing
        func_esc = esc(func_ref)
    else
        func_esc = esc(func)
    end
    #@info "Create Scalar Function" func func_esc, parameters, return_type
    func_name = string(func)
    parameter_names = [p[1] for p in parameters]
    parameter_types = [p[2] for p in parameters]
    parameter_types_vec = Expr(:vect, parameter_types...) # create a vector expression, e.g. [Int, Int]
    wrapper_expr = _udf_generate_wrapper(func_expr, func_esc)

    id = hash((func_expr, rand(UInt64))) # generate a unique id for the function

    return quote
        local wrapper = $(wrapper_expr)
        local fun = ScalarFunction($func_name, $parameter_types_vec, $return_type, $func_esc, wrapper, $id)

        ptr = _udf_register_wrapper($id, wrapper)
        # The @cfunction approach below only works in GLOBAL scope in the REPL:
        # ptr = @cfunction(fun.wrapper, Cvoid, (duckdb_function_info, duckdb_data_chunk, duckdb_vector))
        duckdb_scalar_function_set_function(fun.handle, ptr)
        fun
    end
end
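Once registered, a scalar UDF defined with the macro above is callable from SQL like any built-in function. A minimal end-to-end sketch (assumes the DuckDB.jl and DataFrames.jl packages are installed; run at global scope, since the macro captures the function by name):

```julia
using DuckDB, DataFrames

db = DuckDB.DB()
my_double(x) = 2x
fun = @create_scalar_function my_double(x::Int)::Int
DuckDB.register_scalar_function(db, fun)

# the registered UDF is now usable from SQL
df = DataFrame(DBInterface.execute(db, "SELECT my_double(i) AS d FROM range(3) t(i)"))
```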


# %% --- Conversions ------------------------------------------ #

function _udf_assign_result_init(::Type{T}, vec::Vec) where {T}
    T_internal = julia_to_duck_type(T)
    arr = get_array(vec, T_internal) # this call is quite slow, so we only call it once
    return arr
end

function _udf_assign_result_init(::Type{T}, vec::Vec) where {T <: AbstractString}
    return nothing
end

function _udf_assign_result!(container, ::Type{T}, vec::Vec, result::T, index) where {T}
    container[index] = value_to_duckdb(result) # convert the value to duckdb and assign it to the array
    return nothing
end

function _udf_assign_result!(container, ::Type{T}, vec::Vec, result::T, index) where {T <: AbstractString}
    s = string(result)
    DuckDB.assign_string_element(vec, index, s)
    return nothing
end

function _udf_convert_chunk(::Type{T}, lt::LogicalType, chunk::DataChunk, ix) where {T <: Number}
    x::Vector{T} = get_array(chunk, ix, T)
    return x
end

function _udf_convert_chunk(::Type{T}, lt::LogicalType, chunk::DataChunk, ix) where {T <: AbstractString}
    data = ColumnConversionData((chunk,), ix, lt, nothing)
    return convert_column(data)
end

function _udf_convert_chunk(::Type{T}, lt::LogicalType, chunk::DataChunk, ix) where {T}
    data = ColumnConversionData((chunk,), ix, lt, nothing)
    return convert_column(data)
end
113
external/duckdb/tools/juliapkg/src/statement.jl
vendored
Normal file
@@ -0,0 +1,113 @@
mutable struct Stmt <: DBInterface.Statement
    con::Connection
    handle::duckdb_prepared_statement
    sql::AbstractString
    result_type::Type

    function Stmt(con::Connection, sql::AbstractString, result_type::Type)
        handle = Ref{duckdb_prepared_statement}()
        result = duckdb_prepare(con.handle, sql, handle)
        if result != DuckDBSuccess
            ptr = duckdb_prepare_error(handle[])
            if ptr == C_NULL
                error_message = "Preparation of statement failed: unknown error"
            else
                error_message = unsafe_string(ptr)
            end
            duckdb_destroy_prepare(handle)
            throw(QueryException(error_message))
        end
        stmt = new(con, handle[], sql, result_type)
        finalizer(_close_stmt, stmt)
        return stmt
    end

    function Stmt(db::DB, sql::AbstractString, result_type::Type)
        return Stmt(db.main_connection, sql, result_type)
    end
end

function _close_stmt(stmt::Stmt)
    if stmt.handle != C_NULL
        duckdb_destroy_prepare(stmt.handle)
    end
    stmt.handle = C_NULL
    return
end

DBInterface.getconnection(stmt::Stmt) = stmt.con

function nparameters(stmt::Stmt)
    return Int(duckdb_nparams(stmt.handle))
end

duckdb_bind_internal(stmt::Stmt, i::Integer, val::AbstractFloat) = duckdb_bind_double(stmt.handle, i, Float64(val));
duckdb_bind_internal(stmt::Stmt, i::Integer, val::Bool) = duckdb_bind_boolean(stmt.handle, i, val);
duckdb_bind_internal(stmt::Stmt, i::Integer, val::Int8) = duckdb_bind_int8(stmt.handle, i, val);
duckdb_bind_internal(stmt::Stmt, i::Integer, val::Int16) = duckdb_bind_int16(stmt.handle, i, val);
duckdb_bind_internal(stmt::Stmt, i::Integer, val::Int32) = duckdb_bind_int32(stmt.handle, i, val);
duckdb_bind_internal(stmt::Stmt, i::Integer, val::Int64) = duckdb_bind_int64(stmt.handle, i, val);
duckdb_bind_internal(stmt::Stmt, i::Integer, val::UInt8) = duckdb_bind_uint8(stmt.handle, i, val);
duckdb_bind_internal(stmt::Stmt, i::Integer, val::UInt16) = duckdb_bind_uint16(stmt.handle, i, val);
duckdb_bind_internal(stmt::Stmt, i::Integer, val::UInt32) = duckdb_bind_uint32(stmt.handle, i, val);
duckdb_bind_internal(stmt::Stmt, i::Integer, val::UInt64) = duckdb_bind_uint64(stmt.handle, i, val);
duckdb_bind_internal(stmt::Stmt, i::Integer, val::Float32) = duckdb_bind_float(stmt.handle, i, val);
duckdb_bind_internal(stmt::Stmt, i::Integer, val::Float64) = duckdb_bind_double(stmt.handle, i, val);
duckdb_bind_internal(stmt::Stmt, i::Integer, val::Date) = duckdb_bind_date(stmt.handle, i, value_to_duckdb(val));
duckdb_bind_internal(stmt::Stmt, i::Integer, val::Time) = duckdb_bind_time(stmt.handle, i, value_to_duckdb(val));
duckdb_bind_internal(stmt::Stmt, i::Integer, val::DateTime) =
    duckdb_bind_timestamp(stmt.handle, i, value_to_duckdb(val));
duckdb_bind_internal(stmt::Stmt, i::Integer, val::Missing) = duckdb_bind_null(stmt.handle, i);
duckdb_bind_internal(stmt::Stmt, i::Integer, val::Nothing) = duckdb_bind_null(stmt.handle, i);
duckdb_bind_internal(stmt::Stmt, i::Integer, val::AbstractString) =
    duckdb_bind_varchar_length(stmt.handle, i, val, ncodeunits(val));
duckdb_bind_internal(stmt::Stmt, i::Integer, val::Vector{UInt8}) = duckdb_bind_blob(stmt.handle, i, val, sizeof(val));
duckdb_bind_internal(stmt::Stmt, i::Integer, val::WeakRefString{UInt8}) =
    duckdb_bind_varchar_length(stmt.handle, i, val.ptr, val.len);
function duckdb_bind_internal(stmt::Stmt, i::Integer, val::AbstractVector{T}) where {T}
    value = create_value(val)
    return duckdb_bind_value(stmt.handle, i, value.handle)
end

function duckdb_bind_internal(stmt::Stmt, i::Integer, val::Any)
    println(val)
    throw(NotImplementedException("unsupported type for bind"))
end

function bind_parameters(stmt::Stmt, params::DBInterface.PositionalStatementParams)
    i = 1
    for param in params
        if duckdb_bind_internal(stmt, i, param) != DuckDBSuccess
            throw(QueryException("Failed to bind parameter"))
        end
        i += 1
    end
end

function bind_parameters(stmt::Stmt, params::DBInterface.NamedStatementParams)
    N = nparameters(stmt)
    if length(params) == 0
        return # no parameters to bind
    end
    K = eltype(keys(params))
    for i in 1:N
        name_ptr = duckdb_parameter_name(stmt.handle, i)
        name = unsafe_string(name_ptr)
        duckdb_free(name_ptr)
        name_key = K(name)
        if !haskey(params, name_key)
            if isa(params, NamedTuple)
                value = params[i] # FIXME: a workaround to keep the interface consistent, see the test in test_sqlite.jl
            else
                throw(QueryException("Parameter '$name' not found"))
            end
        else
            value = getindex(params, name_key)
        end
        if duckdb_bind_internal(stmt, i, value) != DuckDBSuccess
            throw(QueryException("Failed to bind parameter '$name'"))
        end
    end
end
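The binding layer above supports both positional and named parameters; `missing` (or `nothing`) binds as SQL NULL. A short sketch (assumes the DuckDB.jl package is installed; the table and parameter names are made up):

```julia
using DuckDB

db = DuckDB.DB()
DBInterface.execute(db, "CREATE TABLE events (id INTEGER, label VARCHAR)")

# named parameters are matched by name via duckdb_parameter_name
stmt = DBInterface.prepare(db, "INSERT INTO events VALUES (\$id, \$label)")
DBInterface.execute(stmt, (id = 1, label = "start"))

# missing binds as SQL NULL through the val::Missing method
DBInterface.execute(stmt, (id = 2, label = missing))
DBInterface.close!(stmt)
```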
369
external/duckdb/tools/juliapkg/src/table_function.jl
vendored
Normal file
@@ -0,0 +1,369 @@
#=
//===--------------------------------------------------------------------===//
// Table Function Bind
//===--------------------------------------------------------------------===//
=#
struct BindInfo
    handle::duckdb_bind_info
    main_function::Any

    function BindInfo(handle::duckdb_bind_info, main_function)
        result = new(handle, main_function)
        return result
    end
end

mutable struct InfoWrapper
    main_function::Any
    info::Any

    function InfoWrapper(main_function, info)
        return new(main_function, info)
    end
end

function parameter_count(bind_info::BindInfo)
    return duckdb_bind_get_parameter_count(bind_info.handle)
end

function get_parameter(bind_info::BindInfo, index::Int64)
    return Value(duckdb_bind_get_parameter(bind_info.handle, index))
end

function set_stats_cardinality(bind_info::BindInfo, cardinality::UInt64, is_exact::Bool)
    duckdb_bind_set_cardinality(bind_info.handle, cardinality, is_exact)
    return
end

function add_result_column(bind_info::BindInfo, name::AbstractString, type::DataType)
    return add_result_column(bind_info, name, create_logical_type(type))
end

function add_result_column(bind_info::BindInfo, name::AbstractString, type::LogicalType)
    return duckdb_bind_add_result_column(bind_info.handle, name, type.handle)
end

function get_extra_data(bind_info::BindInfo)
    return bind_info.main_function.extra_data
end

function _add_global_object(main_function, object)
    lock(main_function.global_lock)
    try
        push!(main_function.global_objects, object)
    finally
        unlock(main_function.global_lock)
    end
    return
end

function _remove_global_object(main_function, object)
    lock(main_function.global_lock)
    try
        delete!(main_function.global_objects, object)
    finally
        unlock(main_function.global_lock)
    end
    return
end

function _table_bind_cleanup(data::Ptr{Cvoid})
    info::InfoWrapper = unsafe_pointer_to_objref(data)
    _remove_global_object(info.main_function, info)
    return
end

function get_exception_info()
    error = ""
    for (exc, bt) in current_exceptions()
        error = string(error, sprint(showerror, exc, bt))
    end
    return error
end
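`get_exception_info` flattens the current exception stack into a single string, so nested errors survive the trip through the C callback boundary. A small illustration of the same pattern in plain Julia (no DuckDB required; `current_exceptions()` is only non-empty inside a `catch` block):

```julia
msg = try
    error("inner failure")
catch
    # inside catch, current_exceptions() holds (exception, backtrace) pairs
    join((sprint(showerror, exc, bt) for (exc, bt) in current_exceptions()), "")
end
# msg now contains "inner failure" plus the formatted backtrace
```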

function _table_bind_function(info::duckdb_bind_info)
    try
        main_function = unsafe_pointer_to_objref(duckdb_bind_get_extra_info(info))
        binfo = BindInfo(info, main_function)
        bind_data = InfoWrapper(main_function, main_function.bind_func(binfo))
        bind_data_pointer = pointer_from_objref(bind_data)
        _add_global_object(main_function, bind_data)
        duckdb_bind_set_bind_data(info, bind_data_pointer, @cfunction(_table_bind_cleanup, Cvoid, (Ptr{Cvoid},)))
    catch
        duckdb_bind_set_error(info, get_exception_info())
        return
    end
    return
end

#=
//===--------------------------------------------------------------------===//
// Table Function Init
//===--------------------------------------------------------------------===//
=#
struct InitInfo
    handle::duckdb_init_info
    main_function::Any

    function InitInfo(handle::duckdb_init_info, main_function)
        result = new(handle, main_function)
        return result
    end
end

function _table_init_function_generic(info::duckdb_init_info, init_fun::Function)
    try
        main_function = unsafe_pointer_to_objref(duckdb_init_get_extra_info(info))
        binfo = InitInfo(info, main_function)
        init_data = InfoWrapper(main_function, init_fun(binfo))
        init_data_pointer = pointer_from_objref(init_data)
        _add_global_object(main_function, init_data)
        duckdb_init_set_init_data(info, init_data_pointer, @cfunction(_table_bind_cleanup, Cvoid, (Ptr{Cvoid},)))
    catch
        duckdb_init_set_error(info, get_exception_info())
        return
    end
    return
end

function _table_init_function(info::duckdb_init_info)
    main_function = unsafe_pointer_to_objref(duckdb_init_get_extra_info(info))
    return _table_init_function_generic(info, main_function.init_func)
end

function _table_local_init_function(info::duckdb_init_info)
    main_function = unsafe_pointer_to_objref(duckdb_init_get_extra_info(info))
    return _table_init_function_generic(info, main_function.init_local_func)
end

function get_bind_info(info::InitInfo, ::Type{T})::T where {T}
    return unsafe_pointer_to_objref(duckdb_init_get_bind_data(info.handle)).info
end

function get_extra_data(info::InitInfo)
    return info.main_function.extra_data
end

function set_max_threads(info::InitInfo, max_threads)
    return duckdb_init_set_max_threads(info.handle, max_threads)
end

function get_projected_columns(info::InitInfo)::Vector{Int64}
    result::Vector{Int64} = Vector()
    column_count = duckdb_init_get_column_count(info.handle)
    for i in 1:column_count
        push!(result, duckdb_init_get_column_index(info.handle, i))
    end
    return result
end

function _empty_init_info(info::DuckDB.InitInfo)
    return missing
end

#=
//===--------------------------------------------------------------------===//
// Main Table Function
//===--------------------------------------------------------------------===//
=#
struct FunctionInfo
    handle::duckdb_function_info
    main_function::Any

    function FunctionInfo(handle::duckdb_function_info, main_function)
        result = new(handle, main_function)
        return result
    end
end

function get_bind_info(info::FunctionInfo, ::Type{T})::T where {T}
    return unsafe_pointer_to_objref(duckdb_function_get_bind_data(info.handle)).info
end

function get_init_info(info::FunctionInfo, ::Type{T})::T where {T}
    return unsafe_pointer_to_objref(duckdb_function_get_init_data(info.handle)).info
end

function get_local_info(info::FunctionInfo, ::Type{T})::T where {T}
    return unsafe_pointer_to_objref(duckdb_function_get_local_init_data(info.handle)).info
end

function _table_main_function(info::duckdb_function_info, chunk::duckdb_data_chunk)
    main_function::TableFunction = unsafe_pointer_to_objref(duckdb_function_get_extra_info(info))
    binfo::FunctionInfo = FunctionInfo(info, main_function)
    try
        main_function.main_func(binfo, DataChunk(chunk, false))
    catch
        duckdb_function_set_error(info, get_exception_info())
    end
    return
end

#=
//===--------------------------------------------------------------------===//
// Table Function
//===--------------------------------------------------------------------===//
=#
"""
DuckDB table function
"""
mutable struct TableFunction
    handle::duckdb_table_function
    bind_func::Function
    init_func::Function
    init_local_func::Function
    main_func::Function
    extra_data::Any
    global_objects::Set{Any}
    global_lock::ReentrantLock

    function TableFunction(
        name::AbstractString,
        parameters::Vector{LogicalType},
        bind_func::Function,
        init_func::Function,
        init_local_func::Function,
        main_func::Function,
        extra_data::Any,
        projection_pushdown::Bool
    )
        handle = duckdb_create_table_function()
        duckdb_table_function_set_name(handle, name)
        for param in parameters
            duckdb_table_function_add_parameter(handle, param.handle)
        end
        result = new(handle, bind_func, init_func, init_local_func, main_func, extra_data, Set(), ReentrantLock())
        finalizer(_destroy_table_function, result)
|
||||
|
||||
duckdb_table_function_set_extra_info(handle, pointer_from_objref(result), C_NULL)
|
||||
duckdb_table_function_set_bind(handle, @cfunction(_table_bind_function, Cvoid, (duckdb_bind_info,)))
|
||||
duckdb_table_function_set_init(handle, @cfunction(_table_init_function, Cvoid, (duckdb_init_info,)))
|
||||
duckdb_table_function_set_local_init(handle, @cfunction(_table_local_init_function, Cvoid, (duckdb_init_info,)))
|
||||
duckdb_table_function_set_function(
|
||||
handle,
|
||||
@cfunction(_table_main_function, Cvoid, (duckdb_function_info, duckdb_data_chunk))
|
||||
)
|
||||
duckdb_table_function_supports_projection_pushdown(handle, projection_pushdown)
|
||||
|
||||
return result
|
||||
end
|
||||
end
|
||||
|
||||
function _destroy_table_function(func::TableFunction)
|
||||
# disconnect from DB
|
||||
if func.handle != C_NULL
|
||||
duckdb_destroy_table_function(func.handle)
|
||||
end
|
||||
return func.handle = C_NULL
|
||||
end
|
||||
|
||||
function create_table_function(
|
||||
con::Connection,
|
||||
name::AbstractString,
|
||||
parameters::Vector{LogicalType},
|
||||
bind_func::Function,
|
||||
init_func::Function,
|
||||
main_func::Function,
|
||||
extra_data::Any = missing,
|
||||
projection_pushdown::Bool = false,
|
||||
init_local_func::Union{Missing, Function} = missing
|
||||
)
|
||||
if init_local_func === missing
|
||||
init_local_func = _empty_init_info
|
||||
end
|
||||
fun = TableFunction(
|
||||
name,
|
||||
parameters,
|
||||
bind_func,
|
||||
init_func,
|
||||
init_local_func,
|
||||
main_func,
|
||||
extra_data,
|
||||
projection_pushdown
|
||||
)
|
||||
if duckdb_register_table_function(con.handle, fun.handle) != DuckDBSuccess
|
||||
throw(QueryException(string("Failed to register table function \"", name, "\"")))
|
||||
end
|
||||
push!(con.db.functions, fun)
|
||||
return
|
||||
end
|
||||
|
||||
function create_table_function(
|
||||
con::Connection,
|
||||
name::AbstractString,
|
||||
parameters::Vector{DataType},
|
||||
bind_func::Function,
|
||||
init_func::Function,
|
||||
main_func::Function,
|
||||
extra_data::Any = missing,
|
||||
projection_pushdown::Bool = false,
|
||||
init_local_func::Union{Missing, Function} = missing
|
||||
)
|
||||
parameter_types::Vector{LogicalType} = Vector()
|
||||
for parameter_type in parameters
|
||||
push!(parameter_types, create_logical_type(parameter_type))
|
||||
end
|
||||
return create_table_function(
|
||||
con,
|
||||
name,
|
||||
parameter_types,
|
||||
bind_func,
|
||||
init_func,
|
||||
main_func,
|
||||
extra_data,
|
||||
projection_pushdown,
|
||||
init_local_func
|
||||
)
|
||||
end
|
||||
|
||||
function create_table_function(
|
||||
db::DB,
|
||||
name::AbstractString,
|
||||
parameters::Vector{LogicalType},
|
||||
bind_func::Function,
|
||||
init_func::Function,
|
||||
main_func::Function,
|
||||
extra_data::Any = missing,
|
||||
projection_pushdown::Bool = false,
|
||||
init_local_func::Union{Missing, Function} = missing
|
||||
)
|
||||
return create_table_function(
|
||||
db.main_connection,
|
||||
name,
|
||||
parameters,
|
||||
bind_func,
|
||||
init_func,
|
||||
main_func,
|
||||
extra_data,
|
||||
projection_pushdown,
|
||||
init_local_func
|
||||
)
|
||||
end
|
||||
|
||||
function create_table_function(
|
||||
db::DB,
|
||||
name::AbstractString,
|
||||
parameters::Vector{DataType},
|
||||
bind_func::Function,
|
||||
init_func::Function,
|
||||
main_func::Function,
|
||||
extra_data::Any = missing,
|
||||
projection_pushdown::Bool = false,
|
||||
init_local_func::Union{Missing, Function} = missing
|
||||
)
|
||||
return create_table_function(
|
||||
db.main_connection,
|
||||
name,
|
||||
parameters,
|
||||
bind_func,
|
||||
init_func,
|
||||
main_func,
|
||||
extra_data,
|
||||
projection_pushdown,
|
||||
init_local_func
|
||||
)
|
||||
end
|
||||
236
external/duckdb/tools/juliapkg/src/table_scan.jl
vendored
Normal file
@@ -0,0 +1,236 @@
struct TableBindInfo
    tbl::Any
    input_columns::Vector
    scan_types::Vector{Type}
    result_types::Vector{Type}
    scan_functions::Vector{Function}

    function TableBindInfo(
        tbl,
        input_columns::Vector,
        scan_types::Vector{Type},
        result_types::Vector{Type},
        scan_functions::Vector{Function}
    )
        return new(tbl, input_columns, scan_types, result_types, scan_functions)
    end
end

table_result_type(tbl, entry) = Core.Compiler.typesubtract(eltype(tbl[entry]), Missing, 1)

julia_to_duck_type(::Type{Date}) = Int32
julia_to_duck_type(::Type{Time}) = Int64
julia_to_duck_type(::Type{DateTime}) = Int64
julia_to_duck_type(::Type{T}) where {T} = T

value_to_duckdb(val::Date) = convert(Int32, Dates.date2epochdays(val) - ROUNDING_EPOCH_TO_UNIX_EPOCH_DAYS)
value_to_duckdb(val::Time) = convert(Int64, Dates.value(val) / 1000)
value_to_duckdb(val::DateTime) = convert(Int64, (Dates.datetime2epochms(val) - ROUNDING_EPOCH_TO_UNIX_EPOCH_MS) * 1000)
value_to_duckdb(val::AbstractString) = throw(
    NotImplementedException(
        "Cannot use value_to_duckdb to convert string values - use DuckDB.assign_string_element on a vector instead"
    )
)
value_to_duckdb(val) = val

function tbl_scan_column(
    input_column::AbstractVector{JL_TYPE},
    row_offset::Int64,
    col_idx::Int64,
    result_idx::Int64,
    scan_count::Int64,
    output::DuckDB.DataChunk,
    ::Type{DUCK_TYPE},
    ::Type{JL_TYPE}
) where {DUCK_TYPE, JL_TYPE}
    vector::Vec = DuckDB.get_vector(output, result_idx)
    result_array::Vector{DUCK_TYPE} = DuckDB.get_array(vector, DUCK_TYPE)
    validity::ValidityMask = DuckDB.get_validity(vector)
    for i::Int64 in 1:scan_count
        val = getindex(input_column, row_offset + i)
        if val === missing
            DuckDB.setinvalid(validity, i)
        else
            result_array[i] = value_to_duckdb(val)
        end
    end
end

function tbl_scan_string_column(
    input_column::AbstractVector{JL_TYPE},
    row_offset::Int64,
    col_idx::Int64,
    result_idx::Int64,
    scan_count::Int64,
    output::DuckDB.DataChunk,
    ::Type{DUCK_TYPE},
    ::Type{JL_TYPE}
) where {DUCK_TYPE, JL_TYPE}
    vector::Vec = DuckDB.get_vector(output, result_idx)
    validity::ValidityMask = DuckDB.get_validity(vector)
    for i::Int64 in 1:scan_count
        val = getindex(input_column, row_offset + i)
        if val === missing
            DuckDB.setinvalid(validity, i)
        else
            DuckDB.assign_string_element(vector, i, val)
        end
    end
end

function tbl_scan_function(tbl, entry)
    result_type = table_result_type(tbl, entry)
    if result_type <: AbstractString
        return tbl_scan_string_column
    end
    return tbl_scan_column
end

function tbl_bind_function(info::DuckDB.BindInfo)
    # fetch the tbl name from the function parameters
    parameter = DuckDB.get_parameter(info, 0)
    name = DuckDB.getvalue(parameter, String)
    # fetch the actual tbl using the function name
    extra_data = DuckDB.get_extra_data(info)
    tbl = extra_data[name]

    # set the cardinality
    row_count::UInt64 = Tables.rowcount(tbl)
    DuckDB.set_stats_cardinality(info, row_count, true)

    # register the result columns
    input_columns = Vector()
    scan_types::Vector{Type} = Vector()
    result_types::Vector{Type} = Vector()
    scan_functions::Vector{Function} = Vector()
    for entry in Tables.columnnames(tbl)
        result_type = table_result_type(tbl, entry)
        scan_function = tbl_scan_function(tbl, entry)
        push!(input_columns, tbl[entry])
        push!(scan_types, eltype(tbl[entry]))
        push!(result_types, julia_to_duck_type(result_type))
        push!(scan_functions, scan_function)

        DuckDB.add_result_column(info, string(entry), result_type)
    end
    return TableBindInfo(tbl, input_columns, scan_types, result_types, scan_functions)
end

mutable struct TableGlobalInfo
    pos::Int64
    global_lock::ReentrantLock

    function TableGlobalInfo()
        return new(0, ReentrantLock())
    end
end

mutable struct TableLocalInfo
    columns::Vector{Int64}
    current_pos::Int64
    end_pos::Int64

    function TableLocalInfo(columns)
        return new(columns, 0, 0)
    end
end

function tbl_global_init_function(info::DuckDB.InitInfo)
    bind_info = DuckDB.get_bind_info(info, TableBindInfo)
    # figure out the maximum number of threads to launch from the tbl size
    row_count::Int64 = Tables.rowcount(bind_info.tbl)
    max_threads::Int64 = ceil(row_count / DuckDB.ROW_GROUP_SIZE)
    DuckDB.set_max_threads(info, max_threads)
    return TableGlobalInfo()
end

function tbl_local_init_function(info::DuckDB.InitInfo)
    columns = DuckDB.get_projected_columns(info)
    return TableLocalInfo(columns)
end

function tbl_scan_function(info::DuckDB.FunctionInfo, output::DuckDB.DataChunk)
    bind_info = DuckDB.get_bind_info(info, TableBindInfo)
    global_info = DuckDB.get_init_info(info, TableGlobalInfo)
    local_info = DuckDB.get_local_info(info, TableLocalInfo)

    if local_info.current_pos >= local_info.end_pos
        # ran out of data to scan in the local info: fetch new rows from the global state (if any)
        # we scan in increments of ROW_GROUP_SIZE rows
        lock(global_info.global_lock) do
            row_count::Int64 = Tables.rowcount(bind_info.tbl)
            local_info.current_pos = global_info.pos
            total_scan_amount::Int64 = DuckDB.ROW_GROUP_SIZE
            if local_info.current_pos + total_scan_amount >= row_count
                total_scan_amount = row_count - local_info.current_pos
            end
            local_info.end_pos = local_info.current_pos + total_scan_amount
            return global_info.pos += total_scan_amount
        end
    end
    scan_count::Int64 = DuckDB.VECTOR_SIZE
    current_row::Int64 = local_info.current_pos
    if current_row + scan_count >= local_info.end_pos
        scan_count = local_info.end_pos - current_row
    end
    local_info.current_pos += scan_count

    result_idx::Int64 = 1
    for col_idx::Int64 in local_info.columns
        if col_idx == 0
            result_idx += 1
            continue
        end
        bind_info.scan_functions[col_idx](
            bind_info.input_columns[col_idx],
            current_row,
            col_idx,
            result_idx,
            scan_count,
            output,
            bind_info.result_types[col_idx],
            bind_info.scan_types[col_idx]
        )
        result_idx += 1
    end
    DuckDB.set_size(output, scan_count)
    return
end

function register_table(con::Connection, tbl, name::AbstractString)
    con.db.registered_objects[name] = columntable(tbl)
    DBInterface.execute(
        con,
        string("CREATE OR REPLACE VIEW \"", name, "\" AS SELECT * FROM julia_tbl_scan('", name, "')")
    )
    return
end
register_table(db::DB, tbl, name::AbstractString) = register_table(db.main_connection, tbl, name)

function unregister_table(con::Connection, name::AbstractString)
    pop!(con.db.registered_objects, name)
    DBInterface.execute(con, string("DROP VIEW IF EXISTS \"", name, "\""))
    return
end
unregister_table(db::DB, name::AbstractString) = unregister_table(db.main_connection, name)

# for backwards compatibility:
const register_data_frame = register_table
const unregister_data_frame = unregister_table

function _add_table_scan(db::DB)
    # add the table scan function
    DuckDB.create_table_function(
        db.main_connection,
        "julia_tbl_scan",
        [String],
        tbl_bind_function,
        tbl_global_init_function,
        tbl_scan_function,
        db.handle.registered_objects,
        true,
        tbl_local_init_function
    )
    return
end
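The parallel scan above hands out ROW_GROUP_SIZE-row chunks from a shared counter under a lock, and each worker then emits its chunk in VECTOR_SIZE pieces. A standalone sketch of that handoff logic (the constants and struct names here are illustrative stand-ins, not the package's own):

```julia
# Illustrative constants; DuckDB's real ROW_GROUP_SIZE / VECTOR_SIZE differ.
const SKETCH_ROW_GROUP_SIZE = 1024
const SKETCH_VECTOR_SIZE = 256

mutable struct SketchGlobal   # shared scan position, like TableGlobalInfo
    pos::Int64
    lock::ReentrantLock
end

mutable struct SketchLocal    # per-worker range, like TableLocalInfo
    current_pos::Int64
    end_pos::Int64
end

# claim the next row group (or the remainder) from the global state
function claim!(g::SketchGlobal, l::SketchLocal, row_count::Int64)
    lock(g.lock) do
        l.current_pos = g.pos
        amount = min(SKETCH_ROW_GROUP_SIZE, row_count - l.current_pos)
        l.end_pos = l.current_pos + amount
        g.pos += amount
    end
end

# drain one worker's share, counting rows emitted per vector-sized chunk
function drain(row_count::Int64)
    g = SketchGlobal(0, ReentrantLock())
    l = SketchLocal(0, 0)
    total = 0
    while true
        if l.current_pos >= l.end_pos
            claim!(g, l, row_count)
            l.current_pos >= l.end_pos && break   # no rows left globally
        end
        scan = min(SKETCH_VECTOR_SIZE, l.end_pos - l.current_pos)
        l.current_pos += scan
        total += scan
    end
    return total
end
```

With several workers each running this loop against one shared `SketchGlobal`, every row is claimed exactly once; `drain(2500)` visits all 2500 rows.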
48
external/duckdb/tools/juliapkg/src/transaction.jl
vendored
Normal file
@@ -0,0 +1,48 @@
function DBInterface.transaction(f, con::Connection)
    begin_transaction(con)
    try
        f()
    catch
        rollback(con)
        rethrow()
    end
    commit(con)
    return
end

function DBInterface.transaction(f, db::DB)
    return DBInterface.transaction(f, db.main_connection)
end

"""
    DuckDB.begin_transaction(db)

Begin a transaction.
"""
function begin_transaction end

begin_transaction(con::Connection) = execute(con, "BEGIN TRANSACTION;")
begin_transaction(db::DB) = begin_transaction(db.main_connection)
transaction(con::Connection) = begin_transaction(con)
transaction(db::DB) = begin_transaction(db)

"""
    DuckDB.commit(db)

Commit a transaction.
"""
function commit end

commit(con::Connection) = execute(con, "COMMIT TRANSACTION;")
commit(db::DB) = commit(db.main_connection)

"""
    DuckDB.rollback(db)

Roll back a transaction.
"""
function rollback end

rollback(con::Connection) = execute(con, "ROLLBACK TRANSACTION;")
rollback(db::DB) = rollback(db.main_connection)
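The `DBInterface.transaction` wrapper above commits only when the body returns normally, and rolls back then rethrows on error. A self-contained sketch of that control flow, with stand-in logging functions instead of DuckDB's actual `execute` calls:

```julia
# Stand-in for the transaction wrapper: records BEGIN/COMMIT/ROLLBACK
# into a log instead of issuing SQL.
function with_transaction(f, log::Vector{String})
    push!(log, "BEGIN")
    try
        f()
    catch
        push!(log, "ROLLBACK")
        rethrow()       # the caller still sees the original error
    end
    push!(log, "COMMIT")
    return
end

log_ok = String[]
with_transaction(() -> push!(log_ok, "work"), log_ok)
# log_ok is ["BEGIN", "work", "COMMIT"]

log_err = String[]
try
    with_transaction(() -> error("boom"), log_err)
catch
end
# log_err is ["BEGIN", "ROLLBACK"] - no COMMIT after a failure
```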
36
external/duckdb/tools/juliapkg/src/validity_mask.jl
vendored
Normal file
@@ -0,0 +1,36 @@
"""
DuckDB validity mask
"""
struct ValidityMask
    data::Vector{UInt64}

    function ValidityMask(data::Vector{UInt64})
        result = new(data)
        return result
    end
end

const BITS_PER_VALUE = 64

function get_entry_index(row_idx)
    return ((row_idx - 1) ÷ BITS_PER_VALUE) + 1
end

function get_index_in_entry(row_idx)
    return (row_idx - 1) % BITS_PER_VALUE
end

function setinvalid(mask::ValidityMask, index)
    entry_idx = get_entry_index(index)
    index_in_entry = get_index_in_entry(index)
    mask.data[entry_idx] &= ~(1 << index_in_entry)
    return
end

function isvalid(mask::ValidityMask, index)::Bool
    entry_idx = get_entry_index(index)
    index_in_entry = get_index_in_entry(index)
    return (mask.data[entry_idx] & (1 << index_in_entry)) != 0
end

all_valid(mask::ValidityMask) = all(==(typemax(eltype(mask.data))), mask.data)
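The validity mask packs 64 one-bit flags per `UInt64` word, mapping a 1-based row index to a word index and a bit offset. A quick standalone check of that arithmetic (the helpers are re-declared here so the snippet runs outside the package):

```julia
# Standalone copy of the validity-mask index math: 64 validity bits per word.
const BITS = 64
entry_index(row_idx) = ((row_idx - 1) ÷ BITS) + 1
index_in_entry(row_idx) = (row_idx - 1) % BITS

# row 1 -> word 1, bit 0; row 64 -> word 1, bit 63; row 65 -> word 2, bit 0

# clearing a bit marks that row invalid (NULL), as setinvalid does
mask = UInt64[typemax(UInt64)]          # one word, all rows valid
row = 3
mask[entry_index(row)] &= ~(UInt64(1) << index_in_entry(row))
# bit 2 of word 1 is now zero; every other bit is still set
```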
59
external/duckdb/tools/juliapkg/src/value.jl
vendored
Normal file
@@ -0,0 +1,59 @@
"""
DuckDB value
"""
mutable struct Value
    handle::duckdb_value

    function Value(handle::duckdb_value)
        result = new(handle)
        finalizer(_destroy_value, result)
        return result
    end
end

function _destroy_value(val::Value)
    if val.handle != C_NULL
        duckdb_destroy_value(val.handle)
    end
    val.handle = C_NULL
    return
end

getvalue(val::Value, ::Type{T}) where {T <: Int64} = duckdb_get_int64(val.handle)
function getvalue(val::Value, ::Type{T}) where {T <: String}
    ptr = duckdb_get_varchar(val.handle)
    result = unsafe_string(ptr)
    duckdb_free(ptr)
    return result
end
function getvalue(val::Value, ::Type{T}) where {T}
    throw(NotImplementedException("Unsupported type for getvalue"))
end

create_value(val::T) where {T <: Bool} = Value(duckdb_create_bool(val))
create_value(val::T) where {T <: Int8} = Value(duckdb_create_int8(val))
create_value(val::T) where {T <: Int16} = Value(duckdb_create_int16(val))
create_value(val::T) where {T <: Int32} = Value(duckdb_create_int32(val))
create_value(val::T) where {T <: Int64} = Value(duckdb_create_int64(val))
create_value(val::T) where {T <: Int128} = Value(duckdb_create_hugeint(val))
create_value(val::T) where {T <: UInt8} = Value(duckdb_create_uint8(val))
create_value(val::T) where {T <: UInt16} = Value(duckdb_create_uint16(val))
create_value(val::T) where {T <: UInt32} = Value(duckdb_create_uint32(val))
create_value(val::T) where {T <: UInt64} = Value(duckdb_create_uint64(val))
create_value(val::T) where {T <: UInt128} = Value(duckdb_create_uhugeint(val))
create_value(val::T) where {T <: Float32} = Value(duckdb_create_float(val))
create_value(val::T) where {T <: Float64} = Value(duckdb_create_double(val))
create_value(val::T) where {T <: Date} =
    Value(duckdb_create_date(Dates.date2epochdays(val) - ROUNDING_EPOCH_TO_UNIX_EPOCH_DAYS))
create_value(val::T) where {T <: Time} = Value(duckdb_create_time(Dates.value(val) ÷ 1000))
create_value(val::T) where {T <: DateTime} =
    Value(duckdb_create_timestamp((Dates.datetime2epochms(val) - ROUNDING_EPOCH_TO_UNIX_EPOCH_MS) * 1000))
create_value(val::T) where {T <: AbstractString} = Value(duckdb_create_varchar_length(val, length(val)))
function create_value(val::AbstractVector{T}) where {T}
    type = create_logical_type(T)
    values = create_value.(val)
    return Value(duckdb_create_list_value(type.handle, map(x -> x.handle, values), length(values)))
end
function create_value(val::T) where {T}
    throw(NotImplementedException("Unsupported type for create_value"))
end
58
external/duckdb/tools/juliapkg/src/vector.jl
vendored
Normal file
@@ -0,0 +1,58 @@
"""
DuckDB vector
"""
struct Vec
    handle::duckdb_vector

    function Vec(handle::duckdb_vector)
        result = new(handle)
        return result
    end
end

function get_array(vector::Vec, ::Type{T}, size = VECTOR_SIZE)::Vector{T} where {T}
    raw_ptr = duckdb_vector_get_data(vector.handle)
    ptr = Base.unsafe_convert(Ptr{T}, raw_ptr)
    return unsafe_wrap(Vector{T}, ptr, size, own = false)
end

function get_validity(vector::Vec, size = VECTOR_SIZE)::ValidityMask
    duckdb_vector_ensure_validity_writable(vector.handle)
    validity_ptr = duckdb_vector_get_validity(vector.handle)
    ptr = Base.unsafe_convert(Ptr{UInt64}, validity_ptr)
    size_words = div(size, BITS_PER_VALUE, RoundUp)
    validity_vector = unsafe_wrap(Vector{UInt64}, ptr, size_words, own = false)
    return ValidityMask(validity_vector)
end

function all_valid(vector::Vec, size = VECTOR_SIZE)::Bool
    validity_ptr = duckdb_vector_get_validity(vector.handle)
    validity_ptr == C_NULL && return true
    size_words = div(size, BITS_PER_VALUE, RoundUp)
    validity_vector = unsafe_wrap(Vector{UInt64}, validity_ptr, size_words, own = false)
    return all_valid(ValidityMask(validity_vector))
end

function list_child(vector::Vec)::Vec
    return Vec(duckdb_list_vector_get_child(vector.handle))
end

function list_size(vector::Vec)::UInt64
    return duckdb_list_vector_get_size(vector.handle)
end

function struct_child(vector::Vec, index::UInt64)::Vec
    return Vec(duckdb_struct_vector_get_child(vector.handle, index))
end

function union_member(vector::Vec, index::UInt64)::Vec
    return Vec(duckdb_union_vector_get_member(vector.handle, index))
end

function assign_string_element(vector::Vec, index::Int64, str::String)
    return duckdb_vector_assign_string_element_len(vector.handle, index, str, sizeof(str))
end

function assign_string_element(vector::Vec, index::Int64, str::AbstractString)
    return duckdb_vector_assign_string_element_len(vector.handle, index, str, sizeof(str))
end
9
external/duckdb/tools/juliapkg/test.sh
vendored
Executable file
@@ -0,0 +1,9 @@
set -e

export JULIA_DUCKDB_LIBRARY="`pwd`/../../build/debug/src/libduckdb.dylib"
#export JULIA_DUCKDB_LIBRARY="`pwd`/../../build/release/src/libduckdb.dylib"

# memory profiling: --track-allocation=user
export JULIA_NUM_THREADS=1
julia --project -e "import Pkg; Pkg.test(; test_args = [\"$1\"])"
BIN
external/duckdb/tools/juliapkg/test/resources/types_list.parquet
vendored
Normal file
Binary file not shown.
BIN
external/duckdb/tools/juliapkg/test/resources/types_map.parquet
vendored
Normal file
Binary file not shown.
BIN
external/duckdb/tools/juliapkg/test/resources/types_nested.parquet
vendored
Normal file
Binary file not shown.
45
external/duckdb/tools/juliapkg/test/runtests.jl
vendored
Normal file
@@ -0,0 +1,45 @@
using DataFrames
using Tables
using DuckDB
using Test
using Dates
using FixedPointDecimals
using UUIDs

test_files = [
    "test_appender.jl",
    "test_basic_queries.jl",
    "test_big_nested.jl",
    "test_config.jl",
    "test_connection.jl",
    "test_tbl_scan.jl",
    "test_prepare.jl",
    "test_transaction.jl",
    "test_sqlite.jl",
    "test_replacement_scan.jl",
    "test_table_function.jl",
    "test_old_interface.jl",
    "test_all_types.jl",
    "test_union_type.jl",
    "test_decimals.jl",
    "test_threading.jl",
    "test_tpch.jl",
    "test_tpch_multithread.jl",
    "test_stream_data_chunk.jl",
    "test_scalar_udf.jl"
]

if length(ARGS) > 0 && !isempty(ARGS[1])
    filtered_test_files = []
    for test_file in test_files
        if test_file == ARGS[1]
            push!(filtered_test_files, test_file)
        end
    end
    test_files = filtered_test_files
end

for fname in test_files
    println(fname)
    include(fname)
end
190
external/duckdb/tools/juliapkg/test/test_all_types.jl
vendored
Normal file
@@ -0,0 +1,190 @@
# test_all_types.jl

@testset "Test All Types" begin
    db = DBInterface.connect(DuckDB.DB)
    con = DBInterface.connect(db)

    df = DataFrame(
        DBInterface.execute(
            con,
            """SELECT * EXCLUDE(time, time_tz, fixed_int_array, fixed_varchar_array, fixed_nested_int_array,
                fixed_nested_varchar_array, fixed_struct_array, struct_of_fixed_array, fixed_array_of_int_list,
                list_of_fixed_int_array, bignum)
            , CASE WHEN time = '24:00:00'::TIME THEN '23:59:59.999999'::TIME ELSE time END AS time
            , CASE WHEN time_tz = '24:00:00-15:59:59'::TIMETZ THEN '23:59:59.999999-15:59:59'::TIMETZ ELSE time_tz END AS time_tz
            FROM test_all_types()
            """
        )
    )
    #println(names(df))
    # we could also use 'propertynames()' to get the column names as symbols; that might make for a better
    # testing approach if we add a dictionary that maps from the symbol to the expected result

    @test isequal(df.bool, [false, true, missing])
    @test isequal(df.tinyint, [-128, 127, missing])
    @test isequal(df.smallint, [-32768, 32767, missing])
    @test isequal(df.int, [-2147483648, 2147483647, missing])
    @test isequal(df.bigint, [-9223372036854775808, 9223372036854775807, missing])
    @test isequal(
        df.hugeint,
        [-170141183460469231731687303715884105728, 170141183460469231731687303715884105727, missing]
    )
    @test isequal(df.uhugeint, [0, 340282366920938463463374607431768211455, missing])
    @test isequal(df.utinyint, [0, 255, missing])
    @test isequal(df.usmallint, [0, 65535, missing])
    @test isequal(df.uint, [0, 4294967295, missing])
    @test isequal(df.ubigint, [0, 18446744073709551615, missing])
    @test isequal(df.float, [-3.4028235f38, 3.4028235f38, missing])
    @test isequal(df.double, [-1.7976931348623157e308, 1.7976931348623157e308, missing])
    @test isequal(df.dec_4_1, [-999.9, 999.9, missing])
    @test isequal(df.dec_9_4, [-99999.9999, 99999.9999, missing])
    @test isequal(df.dec_18_6, [-999999999999.999999, 999999999999.999999, missing])
    @test isequal(
        df.dec38_10,
        [-9999999999999999999999999999.9999999999, 9999999999999999999999999999.9999999999, missing]
    )
    @test isequal(df.small_enum, ["DUCK_DUCK_ENUM", "GOOSE", missing])
    @test isequal(df.medium_enum, ["enum_0", "enum_299", missing])
    @test isequal(df.large_enum, ["enum_0", "enum_69999", missing])
    @test isequal(df.date, [Dates.Date(-5877641, 6, 25), Dates.Date(5881580, 7, 10), missing])
    @test isequal(df.time, [Dates.Time(0, 0, 0), Dates.Time(23, 59, 59, 999, 999), missing])
    @test isequal(df.time_tz, [Dates.Time(0, 0, 0), Dates.Time(23, 59, 59, 999, 999), missing])
    @test isequal(
        df.timestamp,
        [Dates.DateTime(-290308, 12, 22, 0, 0, 0), Dates.DateTime(294247, 1, 10, 4, 0, 54, 775), missing]
    )
    @test isequal(
        df.timestamp_tz,
        [Dates.DateTime(-290308, 12, 22, 0, 0, 0), Dates.DateTime(294247, 1, 10, 4, 0, 54, 775), missing]
    )
    @test isequal(
        df.timestamp_s,
        [Dates.DateTime(-290308, 12, 22, 0, 0, 0), Dates.DateTime(294247, 1, 10, 4, 0, 54, 0), missing]
    )
    @test isequal(
        df.timestamp_ms,
        [Dates.DateTime(-290308, 12, 22, 0, 0, 0), Dates.DateTime(294247, 1, 10, 4, 0, 54, 775), missing]
    )
    @test isequal(
        df.timestamp_ns,
        [Dates.DateTime(1677, 9, 22, 0, 0, 0, 0), Dates.DateTime(2262, 4, 11, 23, 47, 16, 854), missing]
    )
    @test isequal(
        df.interval,
        [
            Dates.CompoundPeriod(Dates.Month(0), Dates.Day(0), Dates.Microsecond(0)),
            Dates.CompoundPeriod(Dates.Month(999), Dates.Day(999), Dates.Microsecond(999999999)),
            missing
        ]
    )
    @test isequal(df.varchar, ["🦆🦆🦆🦆🦆🦆", "goo\0se", missing])
    @test isequal(
        df.blob,
        [
            UInt8[
                0x74, 0x68, 0x69, 0x73, 0x69, 0x73, 0x61, 0x6c, 0x6f, 0x6e, 0x67, 0x62, 0x6c, 0x6f, 0x62,
                0x00, 0x77, 0x69, 0x74, 0x68, 0x6e, 0x75, 0x6c, 0x6c, 0x62, 0x79, 0x74, 0x65, 0x73
            ],
            UInt8[0x00, 0x00, 0x00, 0x61],
            missing
        ]
    )
    @test isequal(df.uuid, [UUID(0), UUID(UInt128(340282366920938463463374607431768211455)), missing])
    @test isequal(df.int_array, [[], [42, 999, missing, missing, -42], missing])
    @test isequal(df.double_array, [[], [42, NaN, Inf, -Inf, missing, -42], missing])
    @test isequal(
        df.date_array,
        [
            [],
            [
                Dates.Date(1970, 1, 1),
                Dates.Date(5881580, 7, 11),
                Dates.Date(-5877641, 6, 24),
                missing,
                Dates.Date(2022, 5, 12)
            ],
            missing
        ]
    )
    @test isequal(
        df.timestamp_array,
        [
            [],
            [
                Dates.DateTime(1970, 1, 1),
                Dates.DateTime(294247, 1, 10, 4, 0, 54, 775),
                Dates.DateTime(-290308, 12, 21, 19, 59, 5, 225),
                missing,
                Dates.DateTime(2022, 5, 12, 16, 23, 45)
            ],
            missing
        ]
    )
    @test isequal(
        df.timestamptz_array,
        [
            [],
            [
                Dates.DateTime(1970, 1, 1),
                Dates.DateTime(294247, 1, 10, 4, 0, 54, 775),
                Dates.DateTime(-290308, 12, 21, 19, 59, 5, 225),
                missing,
                Dates.DateTime(2022, 5, 12, 23, 23, 45)
            ],
            missing
        ]
    )
    @test isequal(df.varchar_array, [[], ["🦆🦆🦆🦆🦆🦆", "goose", missing, ""], missing])
    @test isequal(
        df.nested_int_array,
        [[], [[], [42, 999, missing, missing, -42], missing, [], [42, 999, missing, missing, -42]], missing]
    )
    @test isequal(df.struct, [(a = missing, b = missing), (a = 42, b = "🦆🦆🦆🦆🦆🦆"), missing])
    @test isequal(
        df.struct_of_arrays,
        [
            (a = missing, b = missing),
            (a = [42, 999, missing, missing, -42], b = ["🦆🦆🦆🦆🦆🦆", "goose", missing, ""]),
            missing
        ]
    )
    @test isequal(df.array_of_structs, [[], [(a = missing, b = missing), (a = 42, b = "🦆🦆🦆🦆🦆🦆"), missing], missing])
    @test isequal(df.map, [Dict(), Dict("key1" => "🦆🦆🦆🦆🦆🦆", "key2" => "goose"), missing])
end
158
external/duckdb/tools/juliapkg/test/test_appender.jl
vendored
Normal file
@@ -0,0 +1,158 @@

@testset "Appender Error" begin
    db = DBInterface.connect(DuckDB.DB)
    con = DBInterface.connect(db)

    @test_throws DuckDB.QueryException DuckDB.Appender(db, "nonexistanttable")
    @test_throws DuckDB.QueryException DuckDB.Appender(con, "t")
end

@testset "Appender Usage - Schema $(schema_provided ? "Provided" : "Not Provided")" for schema_provided in (false, true)
    db = DBInterface.connect(DuckDB.DB)

    table_name = "integers"
    if schema_provided
        schema_name = "test"
        full_table_name = "$(schema_name).$(table_name)"
        DBInterface.execute(db, "CREATE SCHEMA $(schema_name)")
    else
        schema_name = nothing
        full_table_name = table_name
    end

    DBInterface.execute(db, "CREATE TABLE $(full_table_name)(i INTEGER)")

    appender = DuckDB.Appender(db, table_name, schema_name)
    DuckDB.close(appender)
    DuckDB.close(appender)

    # close!
    appender = DuckDB.Appender(db, table_name, schema_name)
    DBInterface.close!(appender)

    appender = DuckDB.Appender(db, table_name, schema_name)
    for i in 0:9
        DuckDB.append(appender, i)
        DuckDB.end_row(appender)
    end
    DuckDB.flush(appender)
    DuckDB.close(appender)

    results = DBInterface.execute(db, "SELECT * FROM $(full_table_name)")
    df = DataFrame(results)
    @test names(df) == ["i"]
    @test size(df, 1) == 10
    @test df.i == [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
    # close the database
    DuckDB.close(appender)
end

@testset "Appender API" begin
    # Open the database
    db = DBInterface.connect(DuckDB.DB)

    uuid = Base.UUID("a36a5689-48ec-4104-b147-9fed600d8250")

    # Test data for the appender api test
    # - `col_name`: DuckDB column name
    # - `duck_type`: DuckDB column type
    # - `append_value`: Value to insert via DuckDB.append
    # - `ref_value`: (optional) Expected value from querying the DuckDB table. If not provided, uses `append_value`
    test_data = [
        (; col_name = :bool, duck_type = "BOOLEAN", append_value = true, ref_value = true),
        (; col_name = :tint, duck_type = "TINYINT", append_value = -1, ref_value = Int8(-1)),
        (; col_name = :sint, duck_type = "SMALLINT", append_value = -2, ref_value = Int16(-2)),
        (; col_name = :int, duck_type = "INTEGER", append_value = -3, ref_value = Int32(-3)),
        (; col_name = :bint, duck_type = "BIGINT", append_value = -4, ref_value = Int64(-4)),
        (; col_name = :hint, duck_type = "HUGEINT", append_value = Int128(-5), ref_value = Int128(-5)),
        (; col_name = :utint, duck_type = "UTINYINT", append_value = 1, ref_value = UInt8(1)),
        (; col_name = :usint, duck_type = "USMALLINT", append_value = 2, ref_value = UInt16(2)),
        (; col_name = :uint, duck_type = "UINTEGER", append_value = 3, ref_value = UInt32(3)),
        (; col_name = :ubint, duck_type = "UBIGINT", append_value = 4, ref_value = UInt64(4)),
        (; col_name = :uhint, duck_type = "UHUGEINT", append_value = UInt128(5), ref_value = UInt128(5)),
        (; col_name = :dec16, duck_type = "DECIMAL(4,2)", append_value = FixedDecimal{Int16, 2}(1.01)),
        (; col_name = :dec32, duck_type = "DECIMAL(9,2)", append_value = FixedDecimal{Int32, 2}(1.02)),
        (; col_name = :dec64, duck_type = "DECIMAL(18,2)", append_value = FixedDecimal{Int64, 2}(1.03)),
        (; col_name = :dec128, duck_type = "DECIMAL(38,2)", append_value = FixedDecimal{Int128, 2}(1.04)),
        (; col_name = :float, duck_type = "FLOAT", append_value = 1.0, ref_value = Float32(1.0)),
        (; col_name = :double, duck_type = "DOUBLE", append_value = 2.0, ref_value = Float64(2.0)),
        (; col_name = :date, duck_type = "DATE", append_value = Dates.Date("1970-04-11")),
        (; col_name = :time, duck_type = "TIME", append_value = Dates.Time(0, 0, 0, 0, 200)),
        (; col_name = :timestamp, duck_type = "TIMESTAMP", append_value = Dates.DateTime("1970-01-02T01:23:45.678")),
        (; col_name = :missingval, duck_type = "INTEGER", append_value = missing),
        (; col_name = :nothingval, duck_type = "INTEGER", append_value = nothing, ref_value = missing),
        (; col_name = :largeval, duck_type = "INTEGER", append_value = Int32(2^16)),
        (; col_name = :uuid, duck_type = "UUID", append_value = uuid),
        (; col_name = :varchar, duck_type = "VARCHAR", append_value = "Foo"),
        # lists
        (; col_name = :list_bool, duck_type = "BOOLEAN[]", append_value = Vector{Bool}([true, false, true])),
        (; col_name = :list_int8, duck_type = "TINYINT[]", append_value = Vector{Int8}([1, -2, 3])),
        (; col_name = :list_int16, duck_type = "SMALLINT[]", append_value = Vector{Int16}([1, -2, 3])),
        (; col_name = :list_int32, duck_type = "INTEGER[]", append_value = Vector{Int32}([1, -2, 3])),
        (; col_name = :list_int64, duck_type = "BIGINT[]", append_value = Vector{Int64}([1, -2, 3])),
        (;
            col_name = :list_int128,
            duck_type = "HUGEINT[]",
            append_value = Vector{Int128}([Int128(1), Int128(-2), Int128(3)])
        ),
        # (; col_name = :list_uint8, duck_type = "UTINYINT[]", append_value = Vector{UInt8}([1, 2, 3])),
        (; col_name = :list_uint16, duck_type = "USMALLINT[]", append_value = Vector{UInt16}([1, 2, 3])),
        (; col_name = :list_uint32, duck_type = "UINTEGER[]", append_value = Vector{UInt32}([1, 2, 3])),
        (; col_name = :list_uint64, duck_type = "UBIGINT[]", append_value = Vector{UInt64}([1, 2, 3])),
        (;
            col_name = :list_uint128,
            duck_type = "UHUGEINT[]",
            append_value = Vector{UInt128}([UInt128(1), UInt128(2), UInt128(3)])
        ),
        (; col_name = :list_float, duck_type = "FLOAT[]", append_value = Vector{Float32}([1.0, 2.0, 3.0])),
        (; col_name = :list_double, duck_type = "DOUBLE[]", append_value = Vector{Float64}([1.0, 2.0, 3.0])),
        (; col_name = :list_string, duck_type = "VARCHAR[]", append_value = Vector{String}(["a", "bb", "ccc"])),
        (;
            col_name = :list_date,
            duck_type = "DATE[]",
            append_value = Vector{Dates.Date}([
                Dates.Date("1970-01-01"),
                Dates.Date("1970-01-02"),
                Dates.Date("1970-01-03")
            ])
        ),
        (;
            col_name = :list_time,
            duck_type = "TIME[]",
            append_value = Vector{Dates.Time}([Dates.Time(1), Dates.Time(1, 2), Dates.Time(1, 2, 3)])
        ),
        (;
            col_name = :list_timestamp,
            duck_type = "TIMESTAMP[]",
            append_value = Vector{Dates.DateTime}([
                Dates.DateTime("1970-01-01T00:00:00"),
                Dates.DateTime("1970-01-02T00:00:00"),
                Dates.DateTime("1970-01-03T00:00:00")
            ])
        )
    ]

    sql = """CREATE TABLE dtypes(
        $(join(("$(row.col_name) $(row.duck_type)" for row in test_data), ",\n"))
    )"""
    DuckDB.execute(db, sql)
    appender = DuckDB.Appender(db, "dtypes")
    for row in test_data
        DuckDB.append(appender, row.append_value)
    end
    # End the row of the appender
    DuckDB.end_row(appender)
    # Destroy the appender and flush the data
    DuckDB.flush(appender)
    DuckDB.close(appender)

    results = DBInterface.execute(db, "select * from dtypes;")
    df = DataFrame(results)
    for row in test_data
        ref_value = get(row, :ref_value, row.append_value)
        @test isequal(df[!, row.col_name], [ref_value])
    end

    # close the database
    DBInterface.close!(db)
end
181
external/duckdb/tools/juliapkg/test/test_basic_queries.jl
vendored
Normal file
@@ -0,0 +1,181 @@
# test_basic_queries.jl

using Tables: partitions

@testset "Test DBInterface.execute" begin
    con = DBInterface.connect(DuckDB.DB)

    results = DBInterface.execute(con, "SELECT 42 a")

    # iterator
    for row in Tables.rows(results)
        @test row.a == 42
        @test row[1] == 42
    end

    # convert to DataFrame
    df = DataFrame(results)
    @test names(df) == ["a"]
    @test size(df, 1) == 1
    @test df.a == [42]

    # do block syntax to automatically close cursor
    df = DBInterface.execute(con, "SELECT 42 a") do results
        return DataFrame(results)
    end
    @test names(df) == ["a"]
    @test size(df, 1) == 1
    @test df.a == [42]

    DBInterface.close!(con)
end

@testset "Test numeric data types" begin
    con = DBInterface.connect(DuckDB.DB)

    results = DBInterface.execute(
        con,
        """
        SELECT 42::TINYINT a, 42::INT16 b, 42::INT32 c, 42::INT64 d, 42::UINT8 e, 42::UINT16 f, 42::UINT32 g, 42::UINT64 h
        UNION ALL
        SELECT NULL, NULL, NULL, NULL, NULL, NULL, 43, NULL
        """
    )

    df = DataFrame(results)

    @test size(df, 1) == 2
    @test isequal(df.a, [42, missing])
    @test isequal(df.b, [42, missing])
    @test isequal(df.c, [42, missing])
    @test isequal(df.d, [42, missing])
    @test isequal(df.e, [42, missing])
    @test isequal(df.f, [42, missing])
    @test isequal(df.g::Vector{Int}, [42, 43])
    @test isequal(df.h, [42, missing])

    DBInterface.close!(con)
end

@testset "Test strings" begin
    con = DBInterface.connect(DuckDB.DB)

    results = DBInterface.execute(
        con,
        """
        SELECT 'hello world' s
        UNION ALL
        SELECT NULL
        UNION ALL
        SELECT 'this is a long string'
        UNION ALL
        SELECT 'obligatory mühleisen'
        UNION ALL
        SELECT '🦆🍞🦆'
        """
    )

    df = DataFrame(results)
    @test size(df, 1) == 5
    @test isequal(df.s, ["hello world", missing, "this is a long string", "obligatory mühleisen", "🦆🍞🦆"])

    for s in ["foo", "🦆DB", SubString("foobar", 1, 3), SubString("🦆ling", 1, 6)]
        results = DBInterface.execute(con, "SELECT length(?) as len", [s])
        @test only(results).len == 3
    end

    DBInterface.close!(con)
end

@testset "DBInterface.execute - parser error" begin
    con = DBInterface.connect(DuckDB.DB)

    # parser error
    @test_throws DuckDB.QueryException DBInterface.execute(con, "SELEC")

    DBInterface.close!(con)
end

@testset "DBInterface.execute - binder error" begin
    con = DBInterface.connect(DuckDB.DB)

    # binder error
    @test_throws DuckDB.QueryException DBInterface.execute(con, "SELECT * FROM this_table_does_not_exist")

    DBInterface.close!(con)
end

@testset "DBInterface.execute - runtime error" begin
    con = DBInterface.connect(DuckDB.DB)

    res = DBInterface.execute(con, "select current_setting('threads')")
    df = DataFrame(res)
    print(df)

    # run-time error
    @test_throws DuckDB.QueryException DBInterface.execute(
        con,
        "SELECT i::int FROM (SELECT '42' UNION ALL SELECT 'hello') tbl(i)"
    )

    DBInterface.close!(con)
end

# test a PIVOT query that generates multiple prepared statements and would fail with execute
@testset "Test DBInterface.query" begin
    db = DuckDB.DB()
    con = DuckDB.connect(db)
    DuckDB.execute(con, "CREATE TABLE Cities (Country VARCHAR, Name VARCHAR, Year INT, Population INT);")
    DuckDB.execute(con, "INSERT INTO Cities VALUES ('NL', 'Amsterdam', 2000, 1005)")
    DuckDB.execute(con, "INSERT INTO Cities VALUES ('NL', 'Amsterdam', 2010, 1065)")
    results = DuckDB.query(con, "PIVOT Cities ON Year USING first(Population);")

    # iterator
    for row in Tables.rows(results)
        @test row[:Name] == "Amsterdam"
        @test row[4] == 1065
    end

    # convert to DataFrame
    df = DataFrame(results)
    @test names(df) == ["Country", "Name", "2000", "2010"]
    @test size(df, 1) == 1
    @test df[1, :Country] == "NL"
    @test df[1, :Name] == "Amsterdam"
    @test df[1, "2000"] == 1005
    @test df[1, 4] == 1065

    @test DataFrame(DuckDB.query(db, "select 'a'; select 2;"))[1, 1] == "a"

    DBInterface.close!(con)
end

@testset "Test chunked response" begin
    con = DBInterface.connect(DuckDB.DB)
    DBInterface.execute(con, "CREATE TABLE chunked_table AS SELECT * FROM range(2049)")
    result = DBInterface.execute(con, "SELECT * FROM chunked_table;")
    chunks_it = partitions(result)
    chunks = collect(chunks_it)
    @test length(chunks) == 2
    @test_throws DuckDB.NotImplementedException collect(chunks_it)

    result = DBInterface.execute(con, "SELECT * FROM chunked_table;", DuckDB.StreamResult)
    chunks_it = partitions(result)
    chunks = collect(chunks_it)
    @test length(chunks) == 2
    @test_throws DuckDB.NotImplementedException collect(chunks_it)

    DuckDB.execute(
        con,
        """
        CREATE TABLE large (x1 INT, x2 INT, x3 INT, x4 INT, x5 INT, x6 INT, x7 INT, x8 INT, x9 INT, x10 INT, x11 INT);
        """
    )
    DuckDB.execute(con, "INSERT INTO large VALUES (1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1);")
    result = DBInterface.execute(con, "SELECT * FROM large;")
    chunks_it = partitions(result)
    chunks = collect(chunks_it)
    @test length(chunks) == 1

    DBInterface.close!(con)
end
74
external/duckdb/tools/juliapkg/test/test_big_nested.jl
vendored
Normal file
@@ -0,0 +1,74 @@

@testset "Test big list" begin
    con = DBInterface.connect(DuckDB.DB)

    DBInterface.execute(con, "CREATE TABLE list_table (int_list INT[]);")
    DBInterface.execute(con, "INSERT INTO list_table VALUES (range(2049));")
    df = DataFrame(DBInterface.execute(con, "SELECT * FROM list_table;"))
    @test length(df[1, :int_list]) == 2049

    DBInterface.close!(con)
end

@testset "Test big bitstring" begin
    con = DBInterface.connect(DuckDB.DB)

    DBInterface.execute(con, "CREATE TABLE bit_table (bits BIT);")
    # 131073 = 64 * 2048 + 1
    DBInterface.execute(con, "INSERT INTO bit_table VALUES (bitstring('1010', 131073));")
    df = DataFrame(DBInterface.execute(con, "SELECT * FROM bit_table;"))
    # Currently mapped to Julia in an odd way.
    # Can re-enable following https://github.com/duckdb/duckdb/issues/7065
    @test length(df[1, :bits]) == 131073 skip = true

    DBInterface.close!(con)
end

@testset "Test big string" begin
    con = DBInterface.connect(DuckDB.DB)

    DBInterface.execute(con, "CREATE TABLE str_table (str VARCHAR);")
    DBInterface.execute(con, "INSERT INTO str_table VALUES (repeat('🦆', 1024) || '🪿');")
    df = DataFrame(DBInterface.execute(con, "SELECT * FROM str_table;"))
    @test length(df[1, :str]) == 1025

    DBInterface.close!(con)
end

@testset "Test big map" begin
    con = DBInterface.connect(DuckDB.DB)

    DBInterface.execute(con, "CREATE TABLE map_table (map MAP(VARCHAR, INT));")
    DBInterface.execute(
        con,
        "INSERT INTO map_table VALUES (map_from_entries([{'k': 'billy' || num, 'v': num} for num in range(2049)]));"
    )
    df = DataFrame(DBInterface.execute(con, "SELECT * FROM map_table;"))
    @test length(df[1, :map]) == 2049

    DBInterface.close!(con)
end

@testset "Test big struct" begin
    con = DBInterface.connect(DuckDB.DB)

    DBInterface.execute(con, "CREATE TABLE struct_table (stct STRUCT(a INT[], b INT[]));")
    DBInterface.execute(con, "INSERT INTO struct_table VALUES ({'a': range(1024), 'b': range(1025)});")
    df = DataFrame(DBInterface.execute(con, "SELECT * FROM struct_table;"))
    s = df[1, :stct]
    @test length(s.a) == 1024
    @test length(s.b) == 1025

    DBInterface.close!(con)
end

@testset "Test big union" begin
    con = DBInterface.connect(DuckDB.DB)

    DBInterface.execute(con, "CREATE TABLE union_table (uni UNION(a INT[], b INT));")
    DBInterface.execute(con, "INSERT INTO union_table (uni) VALUES (union_value(a := range(2049))), (42);")
    df = DataFrame(DBInterface.execute(con, "SELECT * FROM union_table;"))
    @test length(df[1, :uni]) == 2049

    DBInterface.close!(con)
end
12
external/duckdb/tools/juliapkg/test/test_c_api.jl
vendored
Normal file
@@ -0,0 +1,12 @@

@testset "C API Type Checks" begin

    # Check struct sizes.
    # A timestamp struct size mismatch means the structs are being stored as pointers;
    # this happens if they are declared as mutable structs.
    @test sizeof(DuckDB.duckdb_timestamp_struct) ==
          sizeof(DuckDB.duckdb_date_struct) + sizeof(DuckDB.duckdb_time_struct)

    # Both structs are equivalent and actually stored as a Union type in C.
    @test sizeof(DuckDB.duckdb_string_t) == sizeof(DuckDB.duckdb_string_t_ptr)

end
96
external/duckdb/tools/juliapkg/test/test_config.jl
vendored
Normal file
@@ -0,0 +1,96 @@
# test_config.jl

@testset "Test configuration parameters" begin
    # by default NULLs come last
    con = DBInterface.connect(DuckDB.DB, ":memory:")

    results = DBInterface.execute(con, "SELECT 42 a UNION ALL SELECT NULL ORDER BY a")
    tbl = rowtable(results)
    @test isequal(tbl, [(a = 42,), (a = missing,)])

    DBInterface.close!(con)

    # if we add this configuration flag, nulls should come first
    config = DuckDB.Config()
    DuckDB.set_config(config, "default_null_order", "nulls_first")

    con = DBInterface.connect(DuckDB.DB, ":memory:", config)

    # NULL should come first now
    results = DBInterface.execute(con, "SELECT 42 a UNION ALL SELECT NULL ORDER BY a")
    df = DataFrame(results)
    @test names(df) == ["a"]
    @test size(df, 1) == 2
    @test isequal(df.a, [missing, 42])

    DBInterface.close!(con)

    DuckDB.set_config(config, "unrecognized option", "aaa")
    @test_throws DuckDB.ConnectionException con = DBInterface.connect(DuckDB.DB, ":memory:", config)

    DBInterface.close!(config)
    DBInterface.close!(config)

    # test different ways to create a config object, all should be equivalent
    conf1 = DuckDB.Config()
    DuckDB.set_config(conf1, "default_null_order", "nulls_first")

    conf2 = DuckDB.Config()
    conf2["default_null_order"] = "nulls_first"

    conf3 = DuckDB.Config(default_null_order = "nulls_first")
    conf4 = DuckDB.Config(["default_null_order" => "nulls_first"])

    @testset for config in [conf1, conf2, conf3, conf4]
        con = DBInterface.connect(DuckDB.DB, ":memory:", config)

        # NULL should come first now
        results = DBInterface.execute(con, "SELECT 42 a UNION ALL SELECT NULL ORDER BY a")
        tbl = rowtable(results)
        @test isequal(tbl, [(a = missing,), (a = 42,)])

        DBInterface.close!(con)

        DuckDB.set_config(config, "unrecognized option", "aaa")
        @test_throws DuckDB.ConnectionException con = DBInterface.connect(DuckDB.DB, ":memory:", config)

        DBInterface.close!(config)
        DBInterface.close!(config)
    end

    # config options can be specified directly in the call
    con = DBInterface.connect(DuckDB.DB, ":memory:"; config = ["default_null_order" => "nulls_first"])
    tbl = DBInterface.execute(con, "SELECT 42 a UNION ALL SELECT NULL ORDER BY a") |> rowtable
    @test isequal(tbl, [(a = missing,), (a = 42,)])
    close(con)

    con = DBInterface.connect(DuckDB.DB, ":memory:"; config = (; default_null_order = "nulls_first"))
    tbl = DBInterface.execute(con, "SELECT 42 a UNION ALL SELECT NULL ORDER BY a") |> rowtable
    @test isequal(tbl, [(a = missing,), (a = 42,)])
    close(con)

    # special handling of the readonly option
    file = tempname()
    con = DBInterface.connect(DuckDB.DB, file)
    DBInterface.execute(con, "CREATE TABLE t1(a INTEGER)")
    close(con)
    con = DBInterface.connect(DuckDB.DB, file; readonly = true)
    @test_throws DuckDB.QueryException DBInterface.execute(con, "CREATE TABLE t2(a INTEGER)")
    close(con)
end

@testset "Test Set TimeZone" begin
    con = DBInterface.connect(DuckDB.DB, ":memory:")

    DBInterface.execute(con, "SET TimeZone='UTC'")
    results = DBInterface.execute(con, "SELECT CURRENT_SETTING('TimeZone') AS tz")
    df = DataFrame(results)
    @test isequal(df[1, "tz"], "UTC")

    DBInterface.execute(con, "SET TimeZone='America/Los_Angeles'")
    results = DBInterface.execute(con, "SELECT CURRENT_SETTING('TimeZone') AS tz")
    df = DataFrame(results)
    @test isequal(df[1, "tz"], "America/Los_Angeles")

    DBInterface.close!(con)
end
55
external/duckdb/tools/juliapkg/test/test_connection.jl
vendored
Normal file
@@ -0,0 +1,55 @@
# test_connection.jl

@testset "Test opening and closing an in-memory database" begin
    con = DBInterface.connect(DuckDB.DB, ":memory:")
    DBInterface.close!(con)
    # verify that double-closing does not cause any problems
    DBInterface.close!(con)
    DBInterface.close!(con)
    @test 1 == 1

    con = DBInterface.connect(DuckDB.DB, ":memory:")
    @test isopen(con)
    close(con)
    @test !isopen(con)
end

@testset "Test opening a bogus directory" begin
    @test_throws DuckDB.ConnectionException DBInterface.connect(DuckDB.DB, "/path/to/bogus/directory")
end


@testset "Test opening and closing an on-disk database" begin
    # This checks for an issue where the DB and the connection are
    # closed but the actual db is not (and subsequently cannot be opened
    # in a different process). To check this, we create a DB, write some
    # data to it, close the connection and check if the WAL file exists.
    #
    # Ideally, the WAL file should not exist, but Garbage Collection of Julia
    # may not have run yet, so open database handles may still exist, preventing
    # the database from being closed properly.

    db_path = joinpath(mktempdir(), "duckdata.db")
    db_path_wal = db_path * ".wal"

    function write_data(dbfile::String)
        db = DuckDB.DB(dbfile)
        conn = DBInterface.connect(db)
        DBInterface.execute(conn, "CREATE OR REPLACE TABLE test (a INTEGER, b INTEGER);")
        DBInterface.execute(conn, "INSERT INTO test VALUES (1, 2);")
        DBInterface.close!(conn)
        DuckDB.close_database(db)
        return true
    end
    write_data(db_path) # call the function
    @test isfile(db_path_wal) === false # WAL file should not exist

    @test isfile(db_path) # check if the database file exists

    # check if the database can be opened
    if haskey(ENV, "JULIA_DUCKDB_LIBRARY")
        duckdb_binary = joinpath(dirname(ENV["JULIA_DUCKDB_LIBRARY"]), "..", "duckdb")
        result = run(`$duckdb_binary $db_path -c "SELECT * FROM test LIMIT 1"`) # check if the database can be opened
        @test success(result)
    end
end
89
external/duckdb/tools/juliapkg/test/test_decimals.jl
vendored
Normal file
@@ -0,0 +1,89 @@
# test_decimals.jl


@testset "Test decimal support" begin
    con = DBInterface.connect(DuckDB.DB)

    results = DBInterface.execute(
        con,
        "SELECT 42.3::DECIMAL(4,1) a, 4923.3::DECIMAL(9,1) b, 421.423::DECIMAL(18,3) c, 129481294.3392::DECIMAL(38,4) d"
    )

    # convert to DataFrame
    df = DataFrame(results)
    @test names(df) == ["a", "b", "c", "d"]
    @test size(df, 1) == 1
    @test df.a == [42.3]
    @test df.b == [4923.3]
    @test df.c == [421.423]
    @test df.d == [129481294.3392]

    DBInterface.close!(con)
end

# test returning decimals in a table function
function my_bind_function(info::DuckDB.BindInfo)
    DuckDB.add_result_column(info, "a", FixedDecimal{Int16, 0})
    DuckDB.add_result_column(info, "b", FixedDecimal{Int32, 1})
    DuckDB.add_result_column(info, "c", FixedDecimal{Int64, 2})
    DuckDB.add_result_column(info, "d", FixedDecimal{Int128, 3})
    return missing
end

mutable struct MyInitStruct
    pos::Int64

    function MyInitStruct()
        return new(0)
    end
end

function my_init_function(info::DuckDB.InitInfo)
    return MyInitStruct()
end

function my_main_function(info::DuckDB.FunctionInfo, output::DuckDB.DataChunk)
    init_info = DuckDB.get_init_info(info, MyInitStruct)

    a_array = DuckDB.get_array(output, 1, Int16)
    b_array = DuckDB.get_array(output, 2, Int32)
    c_array = DuckDB.get_array(output, 3, Int64)
    d_array = DuckDB.get_array(output, 4, Int128)
    count = 0
    multiplier = 1
    for i in 1:(DuckDB.VECTOR_SIZE)
        if init_info.pos >= 3
            break
        end
        a_array[count + 1] = 42 * multiplier
        b_array[count + 1] = 42 * multiplier
        c_array[count + 1] = 42 * multiplier
        d_array[count + 1] = 42 * multiplier
        count += 1
        init_info.pos += 1
        multiplier *= 10
    end

    DuckDB.set_size(output, count)
    return
end

@testset "Test returning decimals from a table function" begin
    con = DBInterface.connect(DuckDB.DB)

    arguments::Vector{DataType} = Vector()
    DuckDB.create_table_function(con, "my_function", arguments, my_bind_function, my_init_function, my_main_function)
    GC.gc()

    # 3 elements
    results = DBInterface.execute(con, "SELECT * FROM my_function()")
    GC.gc()

    df = DataFrame(results)
    @test names(df) == ["a", "b", "c", "d"]
    @test size(df, 1) == 3
    @test df.a == [42, 420, 4200]
    @test df.b == [4.2, 42, 420]
    @test df.c == [0.42, 4.2, 42]
    @test df.d == [0.042, 0.42, 4.2]
end
190
external/duckdb/tools/juliapkg/test/test_old_interface.jl
vendored
Normal file
@@ -0,0 +1,190 @@
# test_old_interface.jl

@testset "DB Connection" begin
    db = DuckDB.open(":memory:")
    con = DuckDB.connect(db)
    @test isa(con, DuckDB.Connection)
    DuckDB.disconnect(con)
    DuckDB.close(db)
end

@testset "Test append DataFrame" begin
    # Open the database
    db = DuckDB.open(":memory:")
    con = DuckDB.connect(db)

    # Create the table the data is appended to
    DuckDB.execute(
        con,
        "CREATE TABLE dtypes(bool BOOLEAN, tint TINYINT, sint SMALLINT, int INTEGER, bint BIGINT, utint UTINYINT, usint USMALLINT, uint UINTEGER, ubint UBIGINT, float FLOAT, double DOUBLE, date DATE, time TIME, vchar VARCHAR, nullval INTEGER)"
    )

    # Create test DataFrame
    input_df = DataFrame(
        bool = [true, false],
        tint = Int8.(1:2),
        sint = Int16.(1:2),
        int = Int32.(1:2),
        bint = Int64.(1:2),
        utint = UInt8.(1:2),
        usint = UInt16.(1:2),
        uint = UInt32.(1:2),
        ubint = UInt64.(1:2),
        float = Float32.(1:2),
        double = Float64.(1:2),
        date = [Dates.Date("1970-04-11"), Dates.Date("1970-04-12")],
        time = [Dates.Time(0, 0, 0, 100, 0), Dates.Time(0, 0, 0, 200, 0)],
        vchar = ["Foo", "Bar"],
        nullval = [missing, Int32(2)]
    )

    # append the DataFrame to the table
    DuckDB.appendDataFrame(input_df, con, "dtypes")

    # Output the data from the table
    output_df = DataFrame(DuckDB.toDataFrame(con, "select * from dtypes;"))

    # Compare each column of the input and output dataframe with each other
    for (col_pos, input_col) in enumerate(eachcol(input_df))
        @test isequal(input_col, output_df[:, col_pos])
    end

    # Disconnect and close the database
    DuckDB.disconnect(con)
    DuckDB.close(db)
end

@testset "Test README" begin
    db = DuckDB.open(":memory:")
    con = DuckDB.connect(db)
    res = DuckDB.execute(con, "CREATE TABLE integers(date DATE, jcol INTEGER)")
    res = DuckDB.execute(con, "INSERT INTO integers VALUES ('2021-09-27', 4), ('2021-09-28', 6), ('2021-09-29', 8)")
    res = DuckDB.execute(con, "SELECT * FROM integers")
    df = DataFrame(DuckDB.toDataFrame(res))
    @test isa(df, DataFrame)
    df = DataFrame(DuckDB.toDataFrame(con, "SELECT * FROM integers"))
    println(typeof(df))
    @test isa(df, DataFrame)
    DuckDB.appendDataFrame(df, con, "integers")
    DuckDB.disconnect(con)
    DuckDB.close(db)
end
#
@testset "HUGE Int test" begin
    db = DuckDB.open(":memory:")
    con = DuckDB.connect(db)
    res = DuckDB.execute(con, "CREATE TABLE huge(id INTEGER,data HUGEINT);")
    res = DuckDB.execute(con, "INSERT INTO huge VALUES (1,NULL), (2, 1761718171), (3, 171661889178);")
    res = DuckDB.toDataFrame(con, "SELECT * FROM huge")
    DuckDB.disconnect(con)
    DuckDB.close(db)
end

@testset "Interval type" begin
    db = DuckDB.open(":memory:")
    con = DuckDB.connect(db)
    res = DuckDB.execute(con, "CREATE TABLE interval(interval INTERVAL);")
    res = DuckDB.execute(
        con,
        """
        INSERT INTO interval VALUES
        (INTERVAL 5 HOUR),
        (INTERVAL 12 MONTH),
        (INTERVAL 12 MICROSECOND),
        (INTERVAL 1 YEAR);
        """
    )
    res = DataFrame(DuckDB.toDataFrame(con, "SELECT * FROM interval;"))
    @test isa(res, DataFrame)
    DuckDB.disconnect(con)
    DuckDB.close(db)
end

@testset "Timestamp" begin
    db = DuckDB.open(":memory:")
    con = DuckDB.connect(db)

    # insert without timezone, display as UTC
    res = DuckDB.execute(con, "CREATE TABLE timestamp(timestamp TIMESTAMP , data INTEGER);")
    res = DuckDB.execute(
        con,
        "INSERT INTO timestamp VALUES ('2021-09-27 11:30:00.000', 4), ('2021-09-28 12:30:00.000', 6), ('2021-09-29 13:30:00.000', 8);"
    )
    res = DuckDB.execute(con, "SELECT * FROM timestamp WHERE timestamp='2021-09-27T11:30:00Z';")
    df = DataFrame(res)
    @test isequal(df[1, "timestamp"], DateTime(2021, 9, 27, 11, 30, 0))

    # insert with timezone, display as UTC
    res = DuckDB.execute(con, "CREATE TABLE timestamp1(timestamp TIMESTAMP , data INTEGER);")
    res = DuckDB.execute(
        con,
        "INSERT INTO timestamp1 VALUES ('2021-09-27T10:30:00.000', 4), ('2021-09-28T11:30:00.000', 6), ('2021-09-29T12:30:00.000', 8);"
    )
    res = DuckDB.execute(con, "SELECT * FROM timestamp1 WHERE timestamp=?;", [DateTime(2021, 9, 27, 10, 30, 0)])
    df = DataFrame(res)
    @test isequal(df[1, "timestamp"], DateTime(2021, 9, 27, 10, 30, 0))

    # query with local datetime, display as UTC
    res = DuckDB.execute(con, "SELECT * FROM timestamp1 WHERE timestamp='2021-09-27T10:30:00.000';")
    df = DataFrame(res)
    @test isequal(df[1, "timestamp"], DateTime(2021, 9, 27, 10, 30, 0))

    DuckDB.disconnect(con)
    DuckDB.close(db)
end

@testset "TimestampTZ" begin
    db = DuckDB.open(":memory:")
    con = DuckDB.connect(db)
    DuckDB.execute(con, "SET TimeZone='Asia/Shanghai'") # UTC+8

    res = DuckDB.execute(con, "SELECT TIMESTAMPTZ '2021-09-27 11:30:00' tz, TIMESTAMP '2021-09-27 11:30:00' ts;")
    df = DataFrame(res)
    @test isequal(df[1, "tz"], DateTime(2021, 9, 27, 3, 30, 0))
    @test isequal(df[1, "ts"], DateTime(2021, 9, 27, 11, 30, 0))

    res = DuckDB.execute(con, "CREATE TABLE timestamptz(timestamp TIMESTAMPTZ , data INTEGER);")
    res = DuckDB.execute(
        con,
        "INSERT INTO timestamptz VALUES ('2021-09-27 11:30:00.000', 4), ('2021-09-28 12:30:00.000', 6), ('2021-09-29 13:30:00.000', 8);"
    )
    res = DuckDB.execute(con, "SELECT * FROM timestamptz WHERE timestamp='2021-09-27 11:30:00'")
    df = DataFrame(res)
    @test isequal(df[1, "data"], 4)
    @test isequal(df[1, "timestamp"], DateTime(2021, 9, 27, 3, 30, 0))

    res = DuckDB.execute(con, "SELECT * FROM timestamptz WHERE timestamp='2021-09-27T03:30:00Z'")
    df = DataFrame(res)
    @test isequal(df[1, "data"], 4)
    @test isequal(df[1, "timestamp"], DateTime(2021, 9, 27, 3, 30, 0))

    res = DuckDB.execute(con, "SELECT * FROM timestamptz WHERE timestamp='2021-09-27T12:30:00+09'")
    df = DataFrame(res)
    @test isequal(df[1, "data"], 4)
    @test isequal(df[1, "timestamp"], DateTime(2021, 9, 27, 3, 30, 0))

    DuckDB.disconnect(con)
    DuckDB.close(db)
end
|
||||
|
||||
@testset "Items table" begin
|
||||
db = DuckDB.open(":memory:")
|
||||
con = DuckDB.connect(db)
|
||||
res = DuckDB.execute(con, "CREATE TABLE items(item VARCHAR, value DECIMAL(10,2), count INTEGER);")
|
||||
res = DuckDB.execute(con, "INSERT INTO items VALUES ('jeans', 20.0, 1), ('hammer', 42.2, 2);")
|
||||
res = DataFrame(DuckDB.toDataFrame(con, "SELECT * FROM items;"))
|
||||
@test isa(res, DataFrame)
|
||||
DuckDB.disconnect(con)
|
||||
end
|
||||
|
||||
@testset "Integers and dates table" begin
|
||||
db = DuckDB.DB()
|
||||
res = DBInterface.execute(db, "CREATE TABLE integers(date DATE, data INTEGER);")
|
||||
res =
|
||||
DBInterface.execute(db, "INSERT INTO integers VALUES ('2021-09-27', 4), ('2021-09-28', 6), ('2021-09-29', 8);")
|
||||
res = DBInterface.execute(db, "SELECT * FROM integers;")
|
||||
res = DataFrame(DuckDB.toDataFrame(res))
|
||||
@test res.date == [Date(2021, 9, 27), Date(2021, 9, 28), Date(2021, 9, 29)]
|
||||
@test isa(res, DataFrame)
|
||||
DBInterface.close!(db)
|
||||
end
|
||||
154
external/duckdb/tools/juliapkg/test/test_prepare.jl
vendored
Normal file
@@ -0,0 +1,154 @@
# test_prepare.jl

@testset "Test DBInterface.prepare" begin
    con = DBInterface.connect(DuckDB.DB)

    DBInterface.execute(con, "CREATE TABLE test_table(i INTEGER, j DOUBLE)")
    stmt = DBInterface.prepare(con, "INSERT INTO test_table VALUES(?, ?)")

    DBInterface.execute(stmt, [1, 3.5])
    DBInterface.execute(stmt, [missing, nothing])
    DBInterface.execute(stmt, [2, 0.5])

    results = DBInterface.execute(con, "SELECT * FROM test_table")
    df = DataFrame(results)

    @test isequal(df.i, [1, missing, 2])
    @test isequal(df.j, [3.5, missing, 0.5])

    # execute many
    DBInterface.executemany(stmt, (col1 = [1, 2, 3, 4, 5], col2 = [1, 2, 4, 8, -0.5]))

    results = DBInterface.execute(con, "SELECT * FROM test_table")
    df = DataFrame(results)

    @test isequal(df.i, [1, missing, 2, 1, 2, 3, 4, 5])
    @test isequal(df.j, [3.5, missing, 0.5, 1, 2, 4, 8, -0.5])

    # can bind vectors to parameters
    stmt = DBInterface.prepare(con, "FROM test_table WHERE i IN ?;")
    results = DBInterface.execute(stmt, ([1, 2],))
    df = DataFrame(results)

    @test all(df.i .∈ Ref([1, 2]))

    # verify that double-closing does not cause any problems
    DBInterface.close!(stmt)
    DBInterface.close!(stmt)
    DBInterface.close!(con)
    DBInterface.close!(con)
end

@testset "Test DBInterface.prepare with various types" begin
    con = DBInterface.connect(DuckDB.DB)

    type_names = [
        "BOOLEAN",
        "TINYINT",
        "SMALLINT",
        "INTEGER",
        "BIGINT",
        "UTINYINT",
        "USMALLINT",
        "UINTEGER",
        "UBIGINT",
        "FLOAT",
        "DOUBLE",
        "DATE",
        "TIME",
        "TIMESTAMP",
        "VARCHAR",
        "INTEGER",
        "BLOB"
    ]
    type_values = [
        Bool(true),
        Int8(3),
        Int16(4),
        Int32(8),
        Int64(20),
        UInt8(42),
        UInt16(300),
        UInt32(420421),
        UInt64(43294832),
        Float32(0.5),
        Float64(0.25),
        Date(1992, 9, 20),
        Time(23, 10, 33),
        DateTime(1992, 9, 20, 23, 10, 33),
        String("hello world"),
        missing,
        rand(UInt8, 100)
    ]
    for i in 1:size(type_values, 1)
        stmt = DBInterface.prepare(con, string("SELECT ?::", type_names[i], " a"))
        result = DataFrame(DBInterface.execute(stmt, [type_values[i]]))
        @test isequal(result.a, [type_values[i]])
    end
end

@testset "DBInterface.prepare: named parameters not supported yet" begin
    con = DBInterface.connect(DuckDB.DB)

    DBInterface.execute(con, "CREATE TABLE test_table(i INTEGER, j DOUBLE)")
    @test_throws DuckDB.QueryException DBInterface.prepare(con, "INSERT INTO test_table VALUES(:col1, :col2)")

    DBInterface.close!(con)
end

@testset "prepare: Named parameters" begin
    con = DBInterface.connect(DuckDB.DB)
    DBInterface.execute(con, "CREATE TABLE test_table(i INTEGER, j DOUBLE)")

    # Check named syntax with Kwargs and Dict
    stmt = DBInterface.prepare(con, raw"INSERT INTO test_table VALUES($col1, $col2)")
    DBInterface.execute(stmt, Dict(["col1" => 1, "col2" => 3.5]))
    DBInterface.execute(stmt; col1 = 2, col2 = 4.5)
    results = DBInterface.execute(con, "SELECT * FROM test_table") |> DataFrame
    @test isequal(results.i, [1, 2])
    @test isequal(results.j, [3.5, 4.5])

    # Check positional syntax
    DBInterface.execute(con, "TRUNCATE TABLE test_table")
    stmt = DBInterface.prepare(con, raw"INSERT INTO test_table VALUES($2, $1)")
    DBInterface.execute(stmt, (3.5, 1))
    DBInterface.execute(stmt, (4.5, 2))
    results = DBInterface.execute(con, "SELECT * FROM test_table") |> DataFrame
    @test isequal(results.i, [1, 2])
    @test isequal(results.j, [3.5, 4.5])

    DBInterface.close!(con)
end

@testset "DBInterface.prepare: execute many" begin
    con = DBInterface.connect(DuckDB.DB)

    DBInterface.execute(con, "CREATE TABLE test_table(i INTEGER, j DOUBLE)")
    @test_throws DuckDB.QueryException DBInterface.prepare(con, "INSERT INTO test_table VALUES(:col1, :col2)")
    stmt = DBInterface.prepare(con, raw"INSERT INTO test_table VALUES($col1, $col2)")
    col1 = [1, 2, 3, 4, 5]
    col2 = [1, 2, 4, 8, -0.5]
    DBInterface.executemany(stmt, (col1 = col1, col2 = col2))
    results = DBInterface.execute(con, "SELECT * FROM test_table") |> DataFrame

    @test isequal(results.i, col1)
    @test isequal(results.j, col2)

    DBInterface.close!(con)
end

@testset "DBInterface.prepare: ambiguous parameters" begin
    con = DBInterface.connect(DuckDB.DB)

    stmt = DBInterface.prepare(con, "SELECT ? AS a")
    result = DataFrame(DBInterface.execute(stmt, [42]))
    @test isequal(result.a, [42])

    result = DataFrame(DBInterface.execute(stmt, ["hello world"]))
    @test isequal(result.a, ["hello world"])

    result = DataFrame(DBInterface.execute(stmt, [DateTime(1992, 9, 20, 23, 10, 33)]))
    @test isequal(result.a, [DateTime(1992, 9, 20, 23, 10, 33)])
end
72
external/duckdb/tools/juliapkg/test/test_replacement_scan.jl
vendored
Normal file
@@ -0,0 +1,72 @@
# test_replacement_scan.jl

function RangeReplacementScan(info)
    table_name = DuckDB.get_table_name(info)
    number = tryparse(Int64, table_name)
    if number === nothing
        return
    end
    DuckDB.set_function_name(info, "range")
    DuckDB.add_function_parameter(info, DuckDB.create_value(number))
    return
end

@testset "Test replacement scans" begin
    con = DBInterface.connect(DuckDB.DB)

    # add a replacement scan that turns any number provided as a table name into range(X)
    DuckDB.add_replacement_scan!(con, RangeReplacementScan, nothing)

    df = DataFrame(DBInterface.execute(con, "SELECT * FROM \"2\" tbl(a)"))
    @test df.a == [0, 1]

    # this still fails
    @test_throws DuckDB.QueryException DBInterface.execute(con, "SELECT * FROM nonexistant")

    DBInterface.close!(con)
end

function RepeatReplacementScan(info)
    table_name = DuckDB.get_table_name(info)
    splits = split(table_name, "*")
    if size(splits, 1) != 2
        return
    end
    number = tryparse(Int64, splits[2])
    if number === nothing
        return
    end
    DuckDB.set_function_name(info, "repeat")
    DuckDB.add_function_parameter(info, DuckDB.create_value(splits[1]))
    DuckDB.add_function_parameter(info, DuckDB.create_value(number))
    return
end

@testset "Test string replacement scans" begin
    con = DBInterface.connect(DuckDB.DB)

    # add a replacement scan that turns any "str*N" table name into repeat(str, N)
    DuckDB.add_replacement_scan!(con, RepeatReplacementScan, nothing)

    df = DataFrame(DBInterface.execute(con, "SELECT * FROM \"hello*2\" tbl(a)"))
    @test df.a == ["hello", "hello"]

    # this still fails
    @test_throws DuckDB.QueryException DBInterface.execute(con, "SELECT * FROM nonexistant")

    DBInterface.close!(con)
end

function ErrorReplacementScan(info)
    throw("replacement scan eek")
end

@testset "Test error replacement scans" begin
    con = DBInterface.connect(DuckDB.DB)

    DuckDB.add_replacement_scan!(con, ErrorReplacementScan, nothing)

    @test_throws DuckDB.QueryException DBInterface.execute(con, "SELECT * FROM nonexistant")

    DBInterface.close!(con)
end
436
external/duckdb/tools/juliapkg/test/test_scalar_udf.jl
vendored
Normal file
@@ -0,0 +1,436 @@
# test_scalar_udf.jl

# Define a simple scalar UDF that doubles the input value

function my_double_function(
    info::DuckDB.duckdb_function_info,
    input::DuckDB.duckdb_data_chunk,
    output::DuckDB.duckdb_vector
)
    # Convert input data chunk to DataChunk object
    input_chunk = DuckDB.DataChunk(input, false)
    n = DuckDB.get_size(input_chunk)

    # Get input vector (assuming one input parameter)
    input_vector = DuckDB.get_vector(input_chunk, 1)
    input_array = DuckDB.get_array(input_vector, Int64, n)

    # Get output vector
    output_array = DuckDB.get_array(DuckDB.Vec(output), Int64, n)

    # Perform the operation: double each input value
    for i in 1:n
        output_array[i] = input_array[i] * 2
    end
end

# Define a scalar UDF that returns NULL for odd numbers and the number itself for even numbers
function my_null_function(
    info::DuckDB.duckdb_function_info,
    input::DuckDB.duckdb_data_chunk,
    output::DuckDB.duckdb_vector
)
    # Convert input data chunk to DataChunk object
    input_chunk = DuckDB.DataChunk(input, false)
    n = DuckDB.get_size(input_chunk)

    # Get input vector
    input_vector = DuckDB.get_vector(input_chunk, 1)
    input_array = DuckDB.get_array(input_vector, Int64, n)
    validity_input = DuckDB.get_validity(input_vector)

    # Get output vector
    output_vector = DuckDB.Vec(output)
    output_array = DuckDB.get_array(output_vector, Int64, n)
    validity_output = DuckDB.get_validity(output_vector)

    # Perform the operation
    for i in 1:n
        if DuckDB.isvalid(validity_input, i)
            if input_array[i] % 2 == 0
                output_array[i] = input_array[i]
                # Validity is true by default, no need to set
            else
                # Set output as NULL
                DuckDB.setinvalid(validity_output, i)
            end
        else
            # Input is NULL, set output as NULL
            DuckDB.setinvalid(validity_output, i)
        end
    end
end

# Define a scalar UDF that always throws an error
function my_error_function(
    info::DuckDB.duckdb_function_info,
    input::DuckDB.duckdb_data_chunk,
    output::DuckDB.duckdb_vector
)
    throw(ErrorException("Runtime error in scalar function"))
end

# Count the occurrences of 'a' in a string column
function my_string_function_count_a(
    info::DuckDB.duckdb_function_info,
    input::DuckDB.duckdb_data_chunk,
    output::DuckDB.duckdb_vector
)
    input_chunk = DuckDB.DataChunk(input, false)
    output_vec = DuckDB.Vec(output)
    n = DuckDB.get_size(input_chunk)
    chunks = [input_chunk]
    extra_info_ptr = DuckDB.duckdb_scalar_function_get_extra_info(info)
    extra_info::DuckDB.ScalarFunction = unsafe_pointer_to_objref(extra_info_ptr)
    conversion_data = DuckDB.ColumnConversionData(chunks, 1, extra_info.logical_parameters[1], nothing)
    a_data_converted = DuckDB.convert_column(conversion_data)
    output_data = DuckDB.get_array(DuckDB.Vec(output), Int, n)

    for row in 1:n
        result = count(x -> x == 'a', a_data_converted[row])
        output_data[row] = result
    end
    return nothing
end

# Reverse the first string argument and concatenate the second one onto it
function my_string_function_reverse_concat(
    info::DuckDB.duckdb_function_info,
    input::DuckDB.duckdb_data_chunk,
    output::DuckDB.duckdb_vector
)
    input_chunk = DuckDB.DataChunk(input, false)
    output_vec = DuckDB.Vec(output)
    n = Int64(DuckDB.get_size(input_chunk))
    chunks = [input_chunk]

    extra_info_ptr = DuckDB.duckdb_scalar_function_get_extra_info(info)
    extra_info::DuckDB.ScalarFunction = unsafe_pointer_to_objref(extra_info_ptr)
    conversion_data_a = DuckDB.ColumnConversionData(chunks, 1, extra_info.logical_parameters[1], nothing)
    conversion_data_b = DuckDB.ColumnConversionData(chunks, 2, extra_info.logical_parameters[2], nothing)

    a_data_converted = DuckDB.convert_column(conversion_data_a)
    b_data_converted = DuckDB.convert_column(conversion_data_b)

    for row in 1:n
        result = string(reverse(a_data_converted[row]), b_data_converted[row])
        DuckDB.assign_string_element(output_vec, row, result)
    end
    return nothing
end
@testset "Test custom scalar functions" begin
    # Connect to DuckDB
    db = DuckDB.DB()
    con = DuckDB.connect(db)

    # Create the test table
    DuckDB.query(con, "CREATE TABLE test_table AS SELECT i FROM range(10) t(i)")

    # Define logical type BIGINT
    type_bigint = DuckDB.duckdb_create_logical_type(DuckDB.DUCKDB_TYPE_BIGINT)

    # Test 1: Double Function
    # Create the scalar function
    f_double = DuckDB.duckdb_create_scalar_function()
    DuckDB.duckdb_scalar_function_set_name(f_double, "double_value")

    # Set parameter types
    DuckDB.duckdb_scalar_function_add_parameter(f_double, type_bigint)

    # Set return type
    DuckDB.duckdb_scalar_function_set_return_type(f_double, type_bigint)

    # Set the function
    CMyDoubleFunction = @cfunction(
        my_double_function,
        Cvoid,
        (DuckDB.duckdb_function_info, DuckDB.duckdb_data_chunk, DuckDB.duckdb_vector)
    )
    DuckDB.duckdb_scalar_function_set_function(f_double, CMyDoubleFunction)

    # Register the function
    res = DuckDB.duckdb_register_scalar_function(con.handle, f_double)
    @test res == DuckDB.DuckDBSuccess

    # Execute the function in a query
    results = DuckDB.query(con, "SELECT i, double_value(i) as doubled FROM test_table")
    df = DataFrame(results)
    @test names(df) == ["i", "doubled"]
    @test size(df, 1) == 10
    @test df.doubled == df.i .* 2

    # Test 2: Null Function
    # Create the scalar function
    f_null = DuckDB.duckdb_create_scalar_function()
    DuckDB.duckdb_scalar_function_set_name(f_null, "null_if_odd")

    # Set parameter types
    DuckDB.duckdb_scalar_function_add_parameter(f_null, type_bigint)

    # Set return type
    DuckDB.duckdb_scalar_function_set_return_type(f_null, type_bigint)

    # Set the function
    CMyNullFunction = @cfunction(
        my_null_function,
        Cvoid,
        (DuckDB.duckdb_function_info, DuckDB.duckdb_data_chunk, DuckDB.duckdb_vector)
    )
    DuckDB.duckdb_scalar_function_set_function(f_null, CMyNullFunction)

    # Register the function
    res_null = DuckDB.duckdb_register_scalar_function(con.handle, f_null)
    @test res_null == DuckDB.DuckDBSuccess

    # Execute the function in a query
    results_null = DuckDB.query(con, "SELECT i, null_if_odd(i) as value_or_null FROM test_table")
    df_null = DataFrame(results_null)
    @test names(df_null) == ["i", "value_or_null"]
    @test size(df_null, 1) == 10
    expected_values = Vector{Union{Missing, Int64}}(undef, 10)
    for idx in 1:10
        i = idx - 1 # Since i ranges from 0 to 9
        if i % 2 == 0
            expected_values[idx] = i
        else
            expected_values[idx] = missing
        end
    end
    @test all(df_null.value_or_null .=== expected_values)

    # Test 3: Error Function
    # Create the scalar function
    f_error = DuckDB.duckdb_create_scalar_function()
    DuckDB.duckdb_scalar_function_set_name(f_error, "error_function")

    # Set parameter types
    DuckDB.duckdb_scalar_function_add_parameter(f_error, type_bigint)

    # Set return type
    DuckDB.duckdb_scalar_function_set_return_type(f_error, type_bigint)

    # Set the function
    CMyErrorFunction = @cfunction(
        my_error_function,
        Cvoid,
        (DuckDB.duckdb_function_info, DuckDB.duckdb_data_chunk, DuckDB.duckdb_vector)
    )
    DuckDB.duckdb_scalar_function_set_function(f_error, CMyErrorFunction)

    # Register the function
    res_error = DuckDB.duckdb_register_scalar_function(con.handle, f_error)
    @test res_error == DuckDB.DuckDBSuccess

    # The Julia exception is expected to surface as an ErrorException
    @test_throws ErrorException DuckDB.query(con, "SELECT error_function(i) FROM test_table")

    # Clean up logical type
    DuckDB.duckdb_destroy_logical_type(type_bigint)

    # Disconnect and close
    DuckDB.disconnect(con)
    DuckDB.close(db)
end

mysum(a, b) = a + b # Dummy function
my_reverse(s) = string(reverse(s))

@testset "UDF_Macro" begin

    # Parse expression
    expr = :(mysum(a::Int, b::String)::Int)
    func_name, func_params, return_value = DuckDB._udf_parse_function_expr(expr)
    @test func_name == :mysum
    @test func_params == [(:a, :Int), (:b, :String)]
    @test return_value == :Int

    # Build expressions
    var_names, expressions =
        DuckDB._udf_generate_conversion_expressions(func_params, :log_types, :convert, :param, :chunk)
    @test var_names == [:param_1, :param_2]
    @test expressions[1] == :(param_1 = convert(Int, log_types[1], chunk, 1))
    @test expressions[2] == :(param_2 = convert(String, log_types[2], chunk, 2))

    # Generate UDF
    db = DuckDB.DB()
    con = DuckDB.connect(db)
    fun = DuckDB.@create_scalar_function mysum(a::Int, b::Int)::Int
    #ptr = @cfunction(fun.wrapper, Cvoid, (Ptr{Cvoid}, Ptr{Cvoid}, Ptr{Cvoid}))

    #ptr = pointer_from_objref(mysum_udf.wrapper)
    #DuckDB.duckdb_scalar_function_set_function(mysum_udf.handle, ptr)
    DuckDB.register_scalar_function(con, fun) # Register UDF

    @test_throws ArgumentError DuckDB.register_scalar_function(con, fun) # Registering the UDF twice throws

    DuckDB.execute(con, "CREATE TABLE test1 (a INT, b INT);")
    DuckDB.execute(con, "INSERT INTO test1 VALUES ('1', '2'), ('3','4'), ('5', '6')")
    result = DuckDB.execute(con, "SELECT mysum(a, b) as result FROM test1") |> DataFrame
    @test result.result == [3, 7, 11]
end

@testset "UDF Macro Various Types" begin
    import Dates
    db = DuckDB.DB()
    con = DuckDB.connect(db)

    my_reverse_inner = (s) -> ("Inner:" * string(reverse(s)))
    fun_is_weekend = (d) -> Dates.dayofweek(d) in (6, 7)
    date_2020 = (x) -> Dates.Date(2020, 1, 1) + Dates.Day(x) # Dummy function

    my_and(a, b) = a && b
    my_int_add(a, b) = a + b
    my_mixed_add(a::Int, b::Float64) = a + b

    df_numbers =
        DataFrame(a = rand(1:100, 30), b = rand(1:100, 30), c = rand(30), d = rand(Bool, 30), e = rand(Bool, 30))
    df_strings = DataFrame(a = ["hello", "world", "julia", "duckdb", "🦆DB"])
    t = Date(2020, 1, 1):Day(1):Date(2020, 12, 31)
    df_dates = DataFrame(t = t, k = 1:length(t), is_weekend = fun_is_weekend.(t))

    DuckDB.register_table(con, df_strings, "test_strings")
    DuckDB.register_table(con, df_dates, "test_dates")
    DuckDB.register_table(con, df_numbers, "test_numbers")

    # Register UDFs
    fun_string = DuckDB.@create_scalar_function my_reverse(s::String)::String (s) -> my_reverse_inner(s)
    DuckDB.register_scalar_function(con, fun_string) # Register UDF

    fun_date = DuckDB.@create_scalar_function is_weekend(d::Date)::Bool fun_is_weekend
    fun_date2 = DuckDB.@create_scalar_function date_2020(x::Int)::Date date_2020
    DuckDB.register_scalar_function(con, fun_date) # Register UDF
    DuckDB.register_scalar_function(con, fun_date2) # Register UDF

    fun_and = DuckDB.@create_scalar_function my_and(a::Bool, b::Bool)::Bool my_and
    fun_int_add = DuckDB.@create_scalar_function my_int_add(a::Int, b::Int)::Int my_int_add
    fun_mixed_add = DuckDB.@create_scalar_function my_mixed_add(a::Int, b::Float64)::Float64 my_mixed_add
    DuckDB.register_scalar_function(con, fun_and)
    DuckDB.register_scalar_function(con, fun_int_add)
    DuckDB.register_scalar_function(con, fun_mixed_add)

    result1 = DuckDB.execute(con, "SELECT my_reverse(a) as result FROM test_strings") |> DataFrame
    @test result1.result == my_reverse_inner.(df_strings.a)

    result2_1 = DuckDB.execute(con, "SELECT is_weekend(t) as result FROM test_dates") |> DataFrame
    @test result2_1.result == fun_is_weekend.(df_dates.t)
    result2_2 = DuckDB.execute(con, "SELECT date_2020(k) as result FROM test_dates") |> DataFrame
    @test result2_2.result == date_2020.(df_dates.k)

    result3 = DuckDB.execute(con, "SELECT my_and(d, e) as result FROM test_numbers") |> DataFrame
    @test result3.result == my_and.(df_numbers.d, df_numbers.e)

    result4 = DuckDB.execute(con, "SELECT my_int_add(a, b) as result FROM test_numbers") |> DataFrame
    @test result4.result == my_int_add.(df_numbers.a, df_numbers.b)

    result5 = DuckDB.execute(con, "SELECT my_mixed_add(a, c) as result FROM test_numbers") |> DataFrame
    @test result5.result == my_mixed_add.(df_numbers.a, df_numbers.c)
end

@testset "UDF Macro Exception" begin

    f_error = function (a)
        if iseven(a)
            throw(ArgumentError("Even number"))
        else
            return a + 1
        end
    end

    db = DuckDB.DB()
    con = DuckDB.connect(db)

    fun_error = DuckDB.@create_scalar_function f_error(a::Int)::Int f_error
    DuckDB.register_scalar_function(con, fun_error) # Register UDF

    df = DataFrame(a = 1:10)
    DuckDB.register_table(con, df, "test1")

    @test_throws Exception result = DuckDB.execute(con, "SELECT f_error(a) as result FROM test1") |> DataFrame
end

@testset "UDF Macro Missing Values" begin

    f_add = (a, b) -> a + b
    db = DuckDB.DB()
    con = DuckDB.connect(db)

    fun = DuckDB.@create_scalar_function f_add(a::Int, b::Int)::Int f_add
    DuckDB.register_scalar_function(con, fun)

    df = DataFrame(a = [1, missing, 3], b = [missing, 2, 3])
    DuckDB.register_table(con, df, "test1")

    result = DuckDB.execute(con, "SELECT f_add(a, b) as result FROM test1") |> DataFrame
    @test isequal(result.result, [missing, missing, 6])
end

@testset "UDF Macro Benchmark" begin
    # Check that the generated UDF is comparable to pure Julia or DuckDB expressions
    #
    # Currently UDFs take about as much time as Julia/DuckDB expressions
    # - The evaluation of the wrapper takes around 20% of the execution time
    # - slow calls are setindex! and getindex
    # - table_scan_func is the slowest call

    db = DuckDB.DB()
    con = DuckDB.connect(db)
    fun_int = DuckDB.@create_scalar_function mysum(a::Int, b::Int)::Int
    fun_float = DuckDB.@create_scalar_function mysum_f(a::Float64, b::Float64)::Float64 mysum

    DuckDB.register_scalar_function(con, fun_int) # Register UDF
    DuckDB.register_scalar_function(con, fun_float) # Register UDF

    N = 10_000_000
    df = DataFrame(a = 1:N, b = 1:N, c = rand(N), d = rand(N))

    DuckDB.register_table(con, df, "test1")

    # Precompile functions
    precompile(mysum, (Int, Int))
    precompile(mysum, (Float64, Float64))
    DuckDB.execute(con, "SELECT mysum(a, b) as result FROM test1")
    DuckDB.execute(con, "SELECT mysum_f(c, d) as result FROM test1")

    # INTEGER benchmark
    t1 = @elapsed result_exp = df.a .+ df.b
    t2 = @elapsed result = DuckDB.execute(con, "SELECT mysum(a, b) as result FROM test1")
    t3 = @elapsed result2 = DuckDB.execute(con, "SELECT a + b as result FROM test1")
    @test DataFrame(result).result == result_exp
    # Prints:
    # Benchmark Int: Julia Expression: 0.092947083, UDF: 0.078665125, DDB: 0.065306042
    @info "Benchmark Int: Julia Expression: $t1, UDF: $t2, DDB: $t3"

    # FLOAT benchmark
    t1 = @elapsed result_exp = df.c .+ df.d
    t2 = @elapsed result = DuckDB.execute(con, "SELECT mysum_f(c, d) as result FROM test1")
    t3 = @elapsed result2 = DuckDB.execute(con, "SELECT c + d as result FROM test1")
    @test DataFrame(result).result ≈ result_exp atol = 1e-6
    # Prints:
    # Benchmark Float: Julia Expression: 0.090409625, UDF: 0.080781, DDB: 0.054156167
    @info "Benchmark Float: Julia Expression: $t1, UDF: $t2, DDB: $t3"
end
327
external/duckdb/tools/juliapkg/test/test_sqlite.jl
vendored
Normal file
@@ -0,0 +1,327 @@
|
||||
# test_sqlite.jl
|
||||
# tests adopted from SQLite.jl
|
||||
|
||||
using Tables
|
||||
|
||||
function setup_clean_test_db(f::Function, args...)
|
||||
tables = [
|
||||
"album",
|
||||
"artist",
|
||||
"customer",
|
||||
"employee",
|
||||
"genre",
|
||||
"invoice",
|
||||
"invoiceline",
|
||||
"mediatype",
|
||||
"playlist",
|
||||
"playlisttrack",
|
||||
"track"
|
||||
]
|
||||
con = DBInterface.connect(DuckDB.DB)
|
||||
datadir = joinpath(@__DIR__, "../data")
|
||||
for table in tables
|
||||
DBInterface.execute(con, "CREATE TABLE $table AS SELECT * FROM '$datadir/$table.parquet'")
|
||||
end
|
||||
|
||||
try
|
||||
f(con)
|
||||
finally
|
||||
close(con)
|
||||
end
|
||||
end
|
||||
|
||||
@testset "DB Connection" begin
|
||||
con = DBInterface.connect(DuckDB.DB)
|
||||
@test con isa DuckDB.DB
|
||||
DBInterface.close!(con)
|
||||
end
|
||||
|
||||
|
||||
@testset "Issue #207: 32 bit integers" begin
|
||||
setup_clean_test_db() do db
|
||||
ds = DBInterface.execute(db, "SELECT 42::INT64 a FROM Track LIMIT 1") |> columntable
|
||||
@test ds.a[1] isa Int64
|
||||
end
|
||||
end
|
||||
|
||||
@testset "Regular DuckDB Tests" begin
|
||||
setup_clean_test_db() do db
|
||||
@test_throws DuckDB.QueryException DBInterface.execute(db, "just some syntax error")
|
||||
# syntax correct, table missing
|
||||
        @test_throws DuckDB.QueryException DBInterface.execute(
            db,
            "SELECT name FROM sqlite_nomaster WHERE type='table';"
        )
    end
end

@testset "close!(query)" begin
    setup_clean_test_db() do db
        qry = DBInterface.execute(db, "SELECT name FROM sqlite_master WHERE type='table';")
        DBInterface.close!(qry)
        return DBInterface.close!(qry) # test it doesn't throw on double-close
    end
end

@testset "Query tables" begin
    setup_clean_test_db() do db
        ds = DBInterface.execute(db, "SELECT name FROM sqlite_master WHERE type='table';") |> columntable
        @test length(ds) == 1
        @test keys(ds) == (:name,)
        @test length(ds.name) == 11
    end
end

@testset "DBInterface.execute([f])" begin
    setup_clean_test_db() do db

        # pipe approach
        results = DBInterface.execute(db, "SELECT * FROM Employee;") |> columntable
        @test length(results) == 15
        @test length(results[1]) == 8
        # callable approach
        @test isequal(DBInterface.execute(columntable, db, "SELECT * FROM Employee"), results)
        employees_stmt = DBInterface.prepare(db, "SELECT * FROM Employee")
        @test isequal(columntable(DBInterface.execute(employees_stmt)), results)
        @test isequal(DBInterface.execute(columntable, employees_stmt), results)
        @testset "throwing from f()" begin
            f(::DuckDB.QueryResult) = error("I'm throwing!")
            @test_throws ErrorException DBInterface.execute(f, employees_stmt)
            @test_throws ErrorException DBInterface.execute(f, db, "SELECT * FROM Employee")
        end
        return DBInterface.close!(employees_stmt)
    end
end

@testset "isempty(::Query)" begin
    setup_clean_test_db() do db

        @test !DBInterface.execute(isempty, db, "SELECT * FROM Employee")
        @test DBInterface.execute(isempty, db, "SELECT * FROM Employee WHERE FirstName='Joanne'")
    end
end

@testset "empty query has correct schema and return type" begin
    setup_clean_test_db() do db
        empty_scheme = DBInterface.execute(Tables.schema, db, "SELECT * FROM Employee WHERE FirstName='Joanne'")
        all_scheme = DBInterface.execute(Tables.schema, db, "SELECT * FROM Employee")
        @test empty_scheme.names == all_scheme.names
        @test all(ea -> ea[1] <: ea[2], zip(empty_scheme.types, all_scheme.types))

        empty_tbl = DBInterface.execute(columntable, db, "SELECT * FROM Employee WHERE FirstName='Joanne'")
        all_tbl = DBInterface.execute(columntable, db, "SELECT * FROM Employee")
        @test propertynames(empty_tbl) == propertynames(all_tbl)
    end
end

@testset "Create table, run commit/rollback tests" begin
    setup_clean_test_db() do db
        DBInterface.execute(db, "create table temp as select * from album")
        DBInterface.execute(db, "alter table temp add column colyear int")
        DBInterface.execute(db, "update temp set colyear = 2014")
        r = DBInterface.execute(db, "select * from temp limit 10") |> columntable
        @test length(r) == 4 && length(r[1]) == 10
        @test all(==(2014), r[4])

        @test_throws DuckDB.QueryException DuckDB.rollback(db)
        @test_throws DuckDB.QueryException DuckDB.commit(db)

        DuckDB.transaction(db)
        DBInterface.execute(db, "update temp set colyear = 2015")
        DuckDB.rollback(db)
        r = DBInterface.execute(db, "select * from temp limit 10") |> columntable
        @test all(==(2014), r[4])

        DuckDB.transaction(db)
        DBInterface.execute(db, "update temp set colyear = 2015")
        DuckDB.commit(db)
        r = DBInterface.execute(db, "select * from temp limit 10") |> columntable
        @test all(==(2015), r[4])
    end
end

@testset "Dates" begin
    setup_clean_test_db() do db
        DBInterface.execute(db, "create table temp as select * from album")
        DBInterface.execute(db, "alter table temp add column dates date")
        stmt = DBInterface.prepare(db, "update temp set dates = ?")
        DBInterface.execute(stmt, (Date(2014, 1, 1),))

        r = DBInterface.execute(db, "select * from temp limit 10") |> columntable
        @test length(r) == 4 && length(r[1]) == 10
        @test isa(r[4][1], Date)
        @test all(Bool[x == Date(2014, 1, 1) for x in r[4]])
        return DBInterface.execute(db, "drop table temp")
    end
end

@testset "Prepared Statements" begin
    setup_clean_test_db() do db

        DBInterface.execute(db, "CREATE TABLE temp AS SELECT * FROM Album")
        r = DBInterface.execute(db, "SELECT * FROM temp LIMIT ?", [3]) |> columntable
        @test length(r) == 3 && length(r[1]) == 3
        r = DBInterface.execute(db, "SELECT * FROM temp WHERE Title ILIKE ?", ["%time%"]) |> columntable
        @test r[1] == [76, 111, 187]
        DBInterface.execute(db, "INSERT INTO temp VALUES (?1, ?3, ?2)", [0, 0, "Test Album"])
        r = DBInterface.execute(db, "SELECT * FROM temp WHERE AlbumId = 0") |> columntable
        @test r[1][1] == 0
        @test r[2][1] == "Test Album"
        @test r[3][1] == 0
        DuckDB.drop!(db, "temp")

        DBInterface.execute(db, "CREATE TABLE temp AS SELECT * FROM Album")
        # FIXME Does it make sense to use named parameters here?
        r = DBInterface.execute(db, "SELECT * FROM temp LIMIT ?", (a = 3,)) |> columntable
        @test length(r) == 3 && length(r[1]) == 3
        r = DBInterface.execute(db, "SELECT * FROM temp LIMIT ?", a = 3) |> columntable
        @test length(r) == 3 && length(r[1]) == 3
        r = DBInterface.execute(db, "SELECT * FROM temp WHERE Title ILIKE ?", (word = "%time%",)) |> columntable
        @test r[1] == [76, 111, 187]
        # FIXME: these are supposed to be named parameter tests, but we don't support that yet
        DBInterface.execute(db, "INSERT INTO temp VALUES (?, ?, ?)", (lid = 0, title = "Test Album", rid = 1))
        DBInterface.execute(db, "INSERT INTO temp VALUES (?, ?, ?)", lid = 400, title = "Test2 Album", rid = 3)
        r = DBInterface.execute(db, "SELECT * FROM temp WHERE AlbumId IN (0, 400)") |> columntable
        @test r[1] == [0, 400]
        @test r[2] == ["Test Album", "Test2 Album"]
        @test r[3] == [1, 3]
        return DuckDB.drop!(db, "temp")
    end
end


@testset "DuckDB to Julia type conversion" begin
    binddb = DBInterface.connect(DuckDB.DB)
    DBInterface.execute(
        binddb,
        "CREATE TABLE temp (n INTEGER, i1 INT, i2 integer,
                            f1 REAL, f2 FLOAT, f3 DOUBLE,
                            s1 TEXT, s2 CHAR(10), s3 VARCHAR(15), s4 NVARCHAR(5),
                            d1 DATETIME, ts1 TIMESTAMP)"
    )
    DBInterface.execute(
        binddb,
        "INSERT INTO temp VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
        [
            missing,
            Int64(6),
            Int64(4),
            6.4,
            6.3,
            Int64(7),
            "some long text",
            "short text",
            "another text",
            "short",
            "2021-02-21",
            "2021-02-12 12:01:32"
        ]
    )
    rr = DBInterface.execute(rowtable, binddb, "SELECT * FROM temp")
    @test length(rr) == 1
    r = first(rr)
    @test typeof.(Tuple(r)) ==
          (Missing, Int32, Int32, Float32, Float32, Float64, String, String, String, String, DateTime, DateTime)
    # Issue #4809: Concrete `String` types.
    # Want to test exactly the types `execute` returns, so check the schema directly and
    # avoid calling `Tuple` or anything else that would narrow the types in the result.
    schema = Tables.schema(rr)
    @test nonmissingtype.(schema.types) ==
          (Int32, Int32, Int32, Float32, Float32, Float64, String, String, String, String, DateTime, DateTime)
end

@testset "Issue #158: Missing DB File" begin
    @test_throws DuckDB.ConnectionException DuckDB.DB("nonexistentdir/not_there.db")
end

@testset "Issue #180, Query" begin
    param = "Hello!"
    query = DBInterface.execute(DuckDB.DB(), "SELECT ?1 UNION ALL SELECT ?1", [param])
    param = "x"
    for row in query
        @test row[1] == "Hello!"
        GC.gc() # this must NOT garbage collect the "Hello!" bound value
    end

    db = DBInterface.connect(DuckDB.DB)
    DBInterface.execute(db, "CREATE TABLE T (a TEXT, PRIMARY KEY (a))")

    q = DBInterface.prepare(db, "INSERT INTO T VALUES(?)")
    DBInterface.execute(q, ["a"])

    @test_throws DuckDB.QueryException DBInterface.execute(q, [1, "a"])
end

@testset "show(DB)" begin
    io = IOBuffer()
    db = DuckDB.DB()

    show(io, db)
    @test String(take!(io)) == "DuckDB.DB(\":memory:\")"

    DBInterface.close!(db)
end

@testset "DuckDB.execute()" begin
    db = DBInterface.connect(DuckDB.DB)
    DBInterface.execute(db, "CREATE TABLE T (x INT UNIQUE)")

    q = DBInterface.prepare(db, "INSERT INTO T VALUES(?)")
    DuckDB.execute(q, (1,))
    r = DBInterface.execute(db, "SELECT * FROM T") |> columntable
    @test r[1] == [1]

    DuckDB.execute(q, [2])
    r = DBInterface.execute(db, "SELECT * FROM T") |> columntable
    @test r[1] == [1, 2]

    q = DBInterface.prepare(db, "INSERT INTO T VALUES(?)")
    DuckDB.execute(q, [3])
    r = DBInterface.execute(columntable, db, "SELECT * FROM T")
    @test r[1] == [1, 2, 3]

    DuckDB.execute(q, [4])
    r = DBInterface.execute(columntable, db, "SELECT * FROM T")
    @test r[1] == [1, 2, 3, 4]

    DuckDB.execute(db, "INSERT INTO T VALUES(?)", [5])
    r = DBInterface.execute(columntable, db, "SELECT * FROM T")
    @test r[1] == [1, 2, 3, 4, 5]

    r = DBInterface.execute(db, strip(" SELECT * FROM T ")) |> columntable
    @test r[1] == [1, 2, 3, 4, 5]

    r = DBInterface.execute(db, "SELECT * FROM T")
    @test Tables.istable(r)
    @test Tables.rowaccess(r)
    @test Tables.rows(r) === r
    @test Base.IteratorSize(typeof(r)) == Base.SizeUnknown()
    row = first(r)
end

@testset "last_insert_rowid unsupported" begin
    db = DBInterface.connect(DuckDB.DB)
    @test_throws DuckDB.NotImplementedException DBInterface.lastrowid(db)
end

@testset "Escaping" begin
    @test DuckDB.esc_id(["1", "2", "3"]) == "\"1\",\"2\",\"3\""
end

@testset "Issue #253: Ensure query column names are unique by default" begin
    db = DuckDB.DB()
    res = DBInterface.execute(db, "select 1 as x2, 2 as x2, 3 as x2, 4 as x2_2") |> columntable
    @test res == (x2 = [1], x2_1 = [2], x2_2 = [3], x2_2_1 = [4])
end

@testset "drop!() table name escaping" begin
    db = DuckDB.DB()
    DBInterface.execute(db, "CREATE TABLE \"escape 10.0%\"(i INTEGER)")
    # table exists
    DBInterface.execute(db, "SELECT * FROM \"escape 10.0%\"")
    # drop the table
    DuckDB.drop!(db, "escape 10.0%")
    # it should no longer exist
    @test_throws DuckDB.QueryException DBInterface.execute(db, "SELECT * FROM \"escape 10.0%\"")
end
96
external/duckdb/tools/juliapkg/test/test_stream_data_chunk.jl
vendored
Normal file
@@ -0,0 +1,96 @@
# test_stream_data_chunk.jl

@testset "Test streaming result sets" begin
    result_types::Vector = Vector()
    push!(result_types, DuckDB.MaterializedResult)
    push!(result_types, DuckDB.StreamResult)

    for result_type in result_types
        con = DBInterface.connect(DuckDB.DB)
        res = DBInterface.execute(con, "SELECT * FROM range(10000) t(i)", result_type)

        @test res.names == [:i]
        @test res.types == [Union{Missing, Int64}]

        # loop over the chunks and perform a sum + count
        sum::Int64 = 0
        total_count::Int64 = 0
        while true
            # fetch the next chunk
            chunk = DuckDB.nextDataChunk(res)
            if chunk === missing
                # consumed all chunks
                break
            end
            # read the data of this chunk
            count = DuckDB.get_size(chunk)
            data = DuckDB.get_array(chunk, 1, Int64)
            for i in 1:count
                sum += data[i]
            end
            total_count += count
            DuckDB.destroy_data_chunk(chunk)
        end
        @test sum == 49995000
        @test total_count == 10000
    end
    GC.gc(true)
end

@testset "Test giant streaming result" begin
    # this would take forever if it wasn't streaming
    con = DBInterface.connect(DuckDB.DB)
    res = DBInterface.execute(con, "SELECT * FROM range(1000000000000) t(i)", DuckDB.StreamResult)

    @test res.names == [:i]
    @test res.types == [Union{Missing, Int64}]

    # fetch the first three chunks
    for i in 1:3
        chunk = DuckDB.nextDataChunk(res)
        @test chunk !== missing
        DuckDB.destroy_data_chunk(chunk)
    end
    DBInterface.close!(res)
    DBInterface.close!(con)
    GC.gc(true)
end

@testset "Test streaming data chunk destruction" begin
    paths = ["types_map.parquet", "types_list.parquet", "types_nested.parquet"]
    for path in paths
        # DuckDB "in memory database"
        connection = DBInterface.connect(DuckDB.DB)
        statement = DuckDB.Stmt(connection, "SELECT * FROM read_parquet(?, file_row_number=1)", DuckDB.StreamResult)
        result = DBInterface.execute(statement, [joinpath(@__DIR__, "resources", path)])
        num_columns = length(result.types)

        while true
            chunk = DuckDB.nextDataChunk(result)
            chunk === missing && break # are we done?

            num_rows = DuckDB.get_size(chunk) # number of rows in the retrieved chunk

            row_ids = DuckDB.get_array(chunk, num_columns, Int64)
            # move over each column; the last column holds the row_ids
            for column_idx in 1:(num_columns - 1)
                column_name::Symbol = result.names[column_idx]

                # Convert from the DuckDB internal types into Julia types
                duckdb_logical_type = DuckDB.LogicalType(DuckDB.duckdb_column_logical_type(result.handle, column_idx))
                duckdb_conversion_state = DuckDB.ColumnConversionData([chunk], column_idx, duckdb_logical_type, nothing)
                duckdb_data = DuckDB.convert_column(duckdb_conversion_state)

                for i in 1:num_rows
                    row_id = row_ids[i] + 1 # julia indices start at 1
                    value = duckdb_data[i]
                    @test value !== missing
                end
            end
            DuckDB.destroy_data_chunk(chunk)
        end
        close(connection)
    end
    GC.gc(true)
end
223
external/duckdb/tools/juliapkg/test/test_table_function.jl
vendored
Normal file
@@ -0,0 +1,223 @@
# test_table_function.jl

struct MyBindStruct
    count::Int64

    function MyBindStruct(count::Int64)
        return new(count)
    end
end

function my_bind_function(info::DuckDB.BindInfo)
    DuckDB.add_result_column(info, "forty_two", Int64)

    parameter = DuckDB.get_parameter(info, 0)
    number = DuckDB.getvalue(parameter, Int64)
    return MyBindStruct(number)
end

mutable struct MyInitStruct
    pos::Int64

    function MyInitStruct()
        return new(0)
    end
end

function my_init_function(info::DuckDB.InitInfo)
    return MyInitStruct()
end

function my_main_function_print(info::DuckDB.FunctionInfo, output::DuckDB.DataChunk)
    bind_info = DuckDB.get_bind_info(info, MyBindStruct)
    init_info = DuckDB.get_init_info(info, MyInitStruct)

    result_array = DuckDB.get_array(output, 1, Int64)
    count = 0
    for i in 1:(DuckDB.VECTOR_SIZE)
        if init_info.pos >= bind_info.count
            break
        end
        result_array[count + 1] = init_info.pos % 2 == 0 ? 42 : 84
        # We print within the table function to test behavior with synchronous API calls in Julia table functions
        println(result_array[count + 1])
        count += 1
        init_info.pos += 1
    end

    DuckDB.set_size(output, count)
    return
end

function my_main_function(info::DuckDB.FunctionInfo, output::DuckDB.DataChunk)
    bind_info = DuckDB.get_bind_info(info, MyBindStruct)
    init_info = DuckDB.get_init_info(info, MyInitStruct)

    result_array = DuckDB.get_array(output, 1, Int64)
    count = 0
    for i in 1:(DuckDB.VECTOR_SIZE)
        if init_info.pos >= bind_info.count
            break
        end
        result_array[count + 1] = init_info.pos % 2 == 0 ? 42 : 84
        count += 1
        init_info.pos += 1
    end

    DuckDB.set_size(output, count)
    return
end

function my_main_function_nulls(info::DuckDB.FunctionInfo, output::DuckDB.DataChunk)
    bind_info = DuckDB.get_bind_info(info, MyBindStruct)
    init_info = DuckDB.get_init_info(info, MyInitStruct)

    result_array = DuckDB.get_array(output, 1, Int64)
    validity = DuckDB.get_validity(output, 1)
    count = 0
    for i in 1:(DuckDB.VECTOR_SIZE)
        if init_info.pos >= bind_info.count
            break
        end
        if init_info.pos % 2 == 0
            result_array[count + 1] = 42
        else
            DuckDB.setinvalid(validity, count + 1)
        end
        count += 1
        init_info.pos += 1
    end

    DuckDB.set_size(output, count)
    return
end

@testset "Test custom table functions that produce IO" begin
    con = DBInterface.connect(DuckDB.DB)

    DuckDB.create_table_function(
        con,
        "forty_two_print",
        [Int64],
        my_bind_function,
        my_init_function,
        my_main_function_print
    )
    GC.gc()

    # 3 elements
    results = DBInterface.execute(con, "SELECT * FROM forty_two_print(3)")
    GC.gc()

    df = DataFrame(results)
    @test names(df) == ["forty_two"]
    @test size(df, 1) == 3
    @test df.forty_two == [42, 84, 42]

    # > vsize elements
    results = DBInterface.execute(con, "SELECT COUNT(*) cnt FROM forty_two_print(10000)")
    GC.gc()

    df = DataFrame(results)
    @test df.cnt == [10000]

    # @time begin
    #     results = DBInterface.execute(con, "SELECT SUM(forty_two) cnt FROM forty_two(10000000)")
    # end
    # df = DataFrame(results)
    # println(df)
end

@testset "Test custom table functions" begin
    con = DBInterface.connect(DuckDB.DB)

    DuckDB.create_table_function(con, "forty_two", [Int64], my_bind_function, my_init_function, my_main_function)
    GC.gc()

    # 3 elements
    results = DBInterface.execute(con, "SELECT * FROM forty_two(3)")
    GC.gc()

    df = DataFrame(results)
    @test names(df) == ["forty_two"]
    @test size(df, 1) == 3
    @test df.forty_two == [42, 84, 42]

    # > vsize elements
    results = DBInterface.execute(con, "SELECT COUNT(*) cnt FROM forty_two(10000)")
    GC.gc()

    df = DataFrame(results)
    @test df.cnt == [10000]

    # @time begin
    #     results = DBInterface.execute(con, "SELECT SUM(forty_two) cnt FROM forty_two(10000000)")
    # end
    # df = DataFrame(results)
    # println(df)

    # return null values from a table function
    DuckDB.create_table_function(
        con,
        "forty_two_nulls",
        [Int64],
        my_bind_function,
        my_init_function,
        my_main_function_nulls
    )
    results = DBInterface.execute(con, "SELECT COUNT(*) total_cnt, COUNT(forty_two) cnt FROM forty_two_nulls(10000)")
    df = DataFrame(results)
    @test df.total_cnt == [10000]
    @test df.cnt == [5000]

    # @time begin
    #     results = DBInterface.execute(con, "SELECT SUM(forty_two) cnt FROM forty_two_nulls(10000000)")
    # end
    # df = DataFrame(results)
    # println(df)
end

function my_bind_error_function(info::DuckDB.BindInfo)
    throw("bind error")
end

function my_init_error_function(info::DuckDB.InitInfo)
    throw("init error")
end

function my_main_error_function(info::DuckDB.FunctionInfo, output::DuckDB.DataChunk)
    throw("runtime error")
end

@testset "Test table function errors" begin
    con = DBInterface.connect(DuckDB.DB)

    DuckDB.create_table_function(
        con,
        "bind_error_function",
        [Int64],
        my_bind_error_function,
        my_init_function,
        my_main_function
    )
    DuckDB.create_table_function(
        con,
        "init_error_function",
        [Int64],
        my_bind_function,
        my_init_error_function,
        my_main_function
    )
    DuckDB.create_table_function(
        con,
        "main_error_function",
        [Int64],
        my_bind_function,
        my_init_function,
        my_main_error_function
    )

    @test_throws DuckDB.QueryException DBInterface.execute(con, "SELECT * FROM bind_error_function(3)")
    @test_throws DuckDB.QueryException DBInterface.execute(con, "SELECT * FROM init_error_function(3)")
    @test_throws DuckDB.QueryException DBInterface.execute(con, "SELECT * FROM main_error_function(3)")
end
328
external/duckdb/tools/juliapkg/test/test_tbl_scan.jl
vendored
Normal file
@@ -0,0 +1,328 @@
# test_tbl_scan.jl

@testset "Test standard DataFrame scan" begin
    con = DBInterface.connect(DuckDB.DB)
    df = DataFrame(a = [1, 2, 3], b = [42, 84, 42])

    DuckDB.register_table(con, df, "my_df")
    GC.gc()

    results = DBInterface.execute(con, "SELECT * FROM my_df")
    GC.gc()
    df = DataFrame(results)
    @test names(df) == ["a", "b"]
    @test size(df, 1) == 3
    @test df.a == [1, 2, 3]
    @test df.b == [42, 84, 42]

    DBInterface.close!(con)
end

@testset "Test standard table scan" begin
    df = (a = [1, 2, 3], b = [42, 84, 42])
    for df in [df, Tables.rowtable(df)]
        con = DBInterface.connect(DuckDB.DB)

        DuckDB.register_table(con, df, "my_df")
        GC.gc()

        results = DBInterface.execute(con, "SELECT * FROM my_df")
        GC.gc()
        df = columntable(results)
        @test Tables.columnnames(df) == (:a, :b)
        @test Tables.rowcount(df) == 3
        @test df.a == [1, 2, 3]
        @test df.b == [42, 84, 42]

        DBInterface.close!(con)
    end
end

@testset "Test DataFrame scan with NULL values" begin
    con = DBInterface.connect(DuckDB.DB)
    df = DataFrame(a = [1, missing, 3], b = [missing, 84, missing])

    DuckDB.register_table(con, df, "my_df")

    results = DBInterface.execute(con, "SELECT * FROM my_df")
    df = DataFrame(results)
    @test names(df) == ["a", "b"]
    @test size(df, 1) == 3
    @test isequal(df.a, [1, missing, 3])
    @test isequal(df.b, [missing, 84, missing])

    DBInterface.close!(con)
end

@testset "Test table scan with NULL values" begin
    df = (a = [1, missing, 3], b = [missing, 84, missing])
    for df in [df, Tables.rowtable(df)]
        con = DBInterface.connect(DuckDB.DB)

        DuckDB.register_table(con, df, "my_df")

        results = DBInterface.execute(con, "SELECT * FROM my_df")
        df = columntable(results)
        @test Tables.columnnames(df) == (:a, :b)
        @test Tables.rowcount(df) == 3
        @test isequal(df.a, [1, missing, 3])
        @test isequal(df.b, [missing, 84, missing])

        DBInterface.close!(con)
    end
end

@testset "Test DataFrame scan with numerics" begin
    con = DBInterface.connect(DuckDB.DB)
    numeric_types = [Int8, Int16, Int32, Int64, UInt8, UInt16, UInt32, UInt64, Float32, Float64]
    for type in numeric_types
        my_df = DataFrame(a = [1, missing, 3], b = [missing, 84, missing])
        my_df[!, :a] = convert.(Union{type, Missing}, my_df[!, :a])
        my_df[!, :b] = convert.(Union{type, Missing}, my_df[!, :b])

        DuckDB.register_table(con, my_df, "my_df")

        results = DBInterface.execute(con, "SELECT * FROM my_df")
        df = DataFrame(results)
        @test isequal(df, my_df)
    end

    DBInterface.close!(con)
end

@testset "Test table scan with numerics" begin
    for tblf in [Tables.columntable, Tables.rowtable]
        con = DBInterface.connect(DuckDB.DB)
        numeric_types = [Int8, Int16, Int32, Int64, UInt8, UInt16, UInt32, UInt64, Float32, Float64]
        for type in numeric_types
            my_df = (a = [1, missing, 3], b = [missing, 84, missing])
            my_df = map(my_df) do col
                return convert.(Union{type, Missing}, col)
            end

            DuckDB.register_table(con, tblf(my_df), "my_df")

            results = DBInterface.execute(con, "SELECT * FROM my_df")
            df = columntable(results)
            @test isequal(df, my_df)
        end

        DBInterface.close!(con)
    end
end

@testset "Test DataFrame scan with various types" begin
    con = DBInterface.connect(DuckDB.DB)

    # boolean
    my_df = DataFrame(a = [true, false, missing])

    DuckDB.register_table(con, my_df, "my_df")

    results = DBInterface.execute(con, "SELECT * FROM my_df")
    df = DataFrame(results)
    @test isequal(df, my_df)

    # date/time/timestamp
    my_df = DataFrame(
        date = [Date(1992, 9, 20), missing, Date(1950, 2, 3)],
        time = [Time(23, 3, 1), Time(11, 49, 33), missing],
        timestamp = [DateTime(1992, 9, 20, 23, 3, 1), DateTime(1950, 2, 3, 11, 49, 3), missing]
    )

    DuckDB.register_table(con, my_df, "my_df")

    results = DBInterface.execute(con, "SELECT * FROM my_df")
    df = DataFrame(results)
    @test isequal(df, my_df)

    DBInterface.close!(con)
end

@testset "Test table scan with various types" begin
    for tblf in [Tables.columntable, Tables.rowtable]
        con = DBInterface.connect(DuckDB.DB)

        # boolean
        my_df = (a = [true, false, missing],)

        DuckDB.register_table(con, tblf(my_df), "my_df")

        results = DBInterface.execute(con, "SELECT * FROM my_df")
        df = columntable(results)
        @test isequal(df, my_df)

        # date/time/timestamp
        my_df = (
            date = [Date(1992, 9, 20), missing, Date(1950, 2, 3)],
            time = [Time(23, 3, 1), Time(11, 49, 33), missing],
            timestamp = [DateTime(1992, 9, 20, 23, 3, 1), DateTime(1950, 2, 3, 11, 49, 3), missing]
        )

        DuckDB.register_table(con, tblf(my_df), "my_df")

        results = DBInterface.execute(con, "SELECT * FROM my_df")
        df = columntable(results)
        @test isequal(df, my_df)

        DBInterface.close!(con)
    end
end

@testset "Test DataFrame scan with strings" begin
    con = DBInterface.connect(DuckDB.DB)

    # strings, including missing values and non-ASCII characters
    my_df = DataFrame(str = ["hello", "this is a very long string", missing, "obligatory mühleisen"])

    DuckDB.register_table(con, my_df, "my_df")

    results = DBInterface.execute(con, "SELECT * FROM my_df")
    df = DataFrame(results)
    @test isequal(df, my_df)

    DBInterface.close!(con)
end

@testset "Test table scan with strings" begin
    for tblf in [Tables.columntable, Tables.rowtable]
        con = DBInterface.connect(DuckDB.DB)

        # strings, including missing values and non-ASCII characters
        my_df = (str = ["hello", "this is a very long string", missing, "obligatory mühleisen"],)

        DuckDB.register_table(con, tblf(my_df), "my_df")

        results = DBInterface.execute(con, "SELECT * FROM my_df")
        df = columntable(results)
        @test isequal(df, my_df)

        DBInterface.close!(con)
    end
end

@testset "Test DataFrame scan projection pushdown" begin
    con = DBInterface.connect(DuckDB.DB)
    df = DataFrame(a = [1, 2, 3], b = [42, 84, 42], c = [3, 7, 18])

    DuckDB.register_table(con, df, "my_df")
    GC.gc()

    results = DBInterface.execute(con, "SELECT b FROM my_df")
    GC.gc()
    df = DataFrame(results)
    @test names(df) == ["b"]
    @test size(df, 1) == 3
    @test df.b == [42, 84, 42]

    results = DBInterface.execute(con, "SELECT c, b FROM my_df")
    GC.gc()
    df = DataFrame(results)
    @test names(df) == ["c", "b"]
    @test size(df, 1) == 3
    @test df.b == [42, 84, 42]
    @test df.c == [3, 7, 18]

    results = DBInterface.execute(con, "SELECT c, a, a FROM my_df")
    GC.gc()
    df = DataFrame(results)
    @test names(df) == ["c", "a", "a_1"]
    @test size(df, 1) == 3
    @test df.c == [3, 7, 18]
    @test df.a == [1, 2, 3]
    @test df.a_1 == [1, 2, 3]

    results = DBInterface.execute(con, "SELECT COUNT(*) cnt FROM my_df")
    GC.gc()
    df = DataFrame(results)
    @test names(df) == ["cnt"]
    @test size(df, 1) == 1
    @test df.cnt == [3]

    GC.gc()

    DBInterface.close!(con)
end

@testset "Test table scan projection pushdown" begin
    for tblf in [Tables.columntable, Tables.rowtable]
        con = DBInterface.connect(DuckDB.DB)
        df = (a = [1, 2, 3], b = [42, 84, 42], c = [3, 7, 18])

        DuckDB.register_table(con, tblf(df), "my_df")
        GC.gc()

        results = DBInterface.execute(con, "SELECT b FROM my_df")
        GC.gc()
        df = columntable(results)
        @test Tables.columnnames(df) == (:b,)
        @test Tables.rowcount(df) == 3
        @test df.b == [42, 84, 42]

        results = DBInterface.execute(con, "SELECT c, b FROM my_df")
        GC.gc()
        df = columntable(results)
        @test Tables.columnnames(df) == (:c, :b)
        @test Tables.rowcount(df) == 3
        @test df.b == [42, 84, 42]
        @test df.c == [3, 7, 18]

        results = DBInterface.execute(con, "SELECT c, a, a FROM my_df")
        GC.gc()
        df = columntable(results)
        @test Tables.columnnames(df) == (:c, :a, :a_1)
        @test Tables.rowcount(df) == 3
        @test df.c == [3, 7, 18]
        @test df.a == [1, 2, 3]
        @test df.a_1 == [1, 2, 3]

        results = DBInterface.execute(con, "SELECT COUNT(*) cnt FROM my_df")
        GC.gc()
        df = columntable(results)
        @test Tables.columnnames(df) == (:cnt,)
        @test Tables.rowcount(df) == 1
        @test df.cnt == [3]

        GC.gc()

        DBInterface.close!(con)
    end
end

@testset "Test large DataFrame scan" begin
    con = DBInterface.connect(DuckDB.DB)

    my_df = DataFrame(DBInterface.execute(con, "SELECT i%5 AS i FROM range(10000000) tbl(i)"))

    DuckDB.register_table(con, my_df, "my_df")
    GC.gc()

    results = DBInterface.execute(con, "SELECT SUM(i) AS sum FROM my_df")
    GC.gc()
    df = DataFrame(results)
    @test names(df) == ["sum"]
    @test size(df, 1) == 1
    @test df.sum == [20000000]

    DBInterface.close!(con)
end

@testset "Test large table scan" begin
    for tblf in [Tables.columntable, Tables.rowtable]
        con = DBInterface.connect(DuckDB.DB)

        my_df = tblf(DBInterface.execute(con, "SELECT i%5 AS i FROM range(10000000) tbl(i)"))

        DuckDB.register_table(con, my_df, "my_df")
        GC.gc()

        results = DBInterface.execute(con, "SELECT SUM(i) AS sum FROM my_df")
        GC.gc()
        df = columntable(results)
        @test Tables.columnnames(df) == (:sum,)
        @test Tables.rowcount(df) == 1
        @test df.sum == [20000000]

        DBInterface.close!(con)
    end
end
12
external/duckdb/tools/juliapkg/test/test_threading.jl
vendored
Normal file
@@ -0,0 +1,12 @@
# test_threading.jl

@testset "Test threading" begin
    con = DBInterface.connect(DuckDB.DB)

    DBInterface.execute(con, "CREATE TABLE integers AS SELECT * FROM range(100000000) t(i)")
    results = DBInterface.execute(con, "SELECT SUM(i) sum FROM integers")
    df = DataFrame(results)
    @test df.sum == [4999999950000000]

    DBInterface.close!(con)
end
59
external/duckdb/tools/juliapkg/test/test_tpch.jl
vendored
Normal file
@@ -0,0 +1,59 @@
# test_tpch.jl

# DuckDB needs to have been built with TPCH (BUILD_TPCH=1) to run this test!

@testset "Test TPC-H" begin
    sf = "0.1"

    # load TPC-H into DuckDB
    native_con = DBInterface.connect(DuckDB.DB)
    try
        DBInterface.execute(native_con, "CALL dbgen(sf=$sf)")
    catch
        @info "TPC-H extension not available; skipping"
        return
    end

    # convert all tables to Julia DataFrames
    customer = DataFrame(DBInterface.execute(native_con, "SELECT * FROM customer"))
    lineitem = DataFrame(DBInterface.execute(native_con, "SELECT * FROM lineitem"))
    nation = DataFrame(DBInterface.execute(native_con, "SELECT * FROM nation"))
    orders = DataFrame(DBInterface.execute(native_con, "SELECT * FROM orders"))
    part = DataFrame(DBInterface.execute(native_con, "SELECT * FROM part"))
    partsupp = DataFrame(DBInterface.execute(native_con, "SELECT * FROM partsupp"))
    region = DataFrame(DBInterface.execute(native_con, "SELECT * FROM region"))
    supplier = DataFrame(DBInterface.execute(native_con, "SELECT * FROM supplier"))

    # now open a new in-memory database, and register the dataframes there
    df_con = DBInterface.connect(DuckDB.DB)
    DuckDB.register_table(df_con, customer, "customer")
    DuckDB.register_table(df_con, lineitem, "lineitem")
    DuckDB.register_table(df_con, nation, "nation")
    DuckDB.register_table(df_con, orders, "orders")
    DuckDB.register_table(df_con, part, "part")
    DuckDB.register_table(df_con, partsupp, "partsupp")
    DuckDB.register_table(df_con, region, "region")
    DuckDB.register_table(df_con, supplier, "supplier")
    GC.gc()

    # run all the queries
    for i in 1:22
        # print("Q$i\n")
        # for each query, compare the results of the query run on the original tables
        # versus the result when run on the Julia DataFrames
        res = DataFrame(DBInterface.execute(df_con, "PRAGMA tpch($i)"))
        res2 = DataFrame(DBInterface.execute(native_con, "PRAGMA tpch($i)"))
        @test isequal(res, res2)
        # print("Native DuckDB\n")
        # @time begin
        #     results = DBInterface.execute(native_con, "PRAGMA tpch($i)")
        # end
        # print("DataFrame\n")
        # @time begin
        #     results = DBInterface.execute(df_con, "PRAGMA tpch($i)")
        # end
    end

    DBInterface.close!(df_con)
    DBInterface.close!(native_con)
end
54
external/duckdb/tools/juliapkg/test/test_tpch_multithread.jl
vendored
Normal file
@@ -0,0 +1,54 @@
# test_tpch_multithread.jl

# DuckDB needs to have been built with TPCH (BUILD_TPCH=1) to run this test!

function test_tpch_multithread()
    sf = "0.10"

    # load TPC-H into DuckDB
    native_con = DBInterface.connect(DuckDB.DB)
    try
        DBInterface.execute(native_con, "CALL dbgen(sf=$sf)")
    catch
        @info "TPC-H extension not available; skipping"
        return
    end

    # convert all tables to Julia DataFrames
    customer = DataFrame(DBInterface.execute(native_con, "SELECT * FROM customer"))
    lineitem = DataFrame(DBInterface.execute(native_con, "SELECT * FROM lineitem"))
    nation = DataFrame(DBInterface.execute(native_con, "SELECT * FROM nation"))
    orders = DataFrame(DBInterface.execute(native_con, "SELECT * FROM orders"))
    part = DataFrame(DBInterface.execute(native_con, "SELECT * FROM part"))
    partsupp = DataFrame(DBInterface.execute(native_con, "SELECT * FROM partsupp"))
    region = DataFrame(DBInterface.execute(native_con, "SELECT * FROM region"))
    supplier = DataFrame(DBInterface.execute(native_con, "SELECT * FROM supplier"))

    id = Threads.threadid()
    # now open a new in-memory database, and register the dataframes there
    df_con = DBInterface.connect(DuckDB.DB)
    DuckDB.register_table(df_con, customer, "customer")
    DuckDB.register_table(df_con, lineitem, "lineitem")
    DuckDB.register_table(df_con, nation, "nation")
    DuckDB.register_table(df_con, orders, "orders")
    DuckDB.register_table(df_con, part, "part")
    DuckDB.register_table(df_con, partsupp, "partsupp")
    DuckDB.register_table(df_con, region, "region")
    DuckDB.register_table(df_con, supplier, "supplier")
    GC.gc()

    # Execute all the queries
    for _ in 1:10
        for i in 1:22
            print("T:$id | Q:$i\n")
            res = DataFrame(DBInterface.execute(df_con, "PRAGMA tpch($i)"))
        end
    end
    DBInterface.close!(df_con)
    return DBInterface.close!(native_con)
end

@testset "Test TPC-H Stresstest" begin
    test_tpch_multithread()
end
23
external/duckdb/tools/juliapkg/test/test_transaction.jl
vendored
Normal file
@@ -0,0 +1,23 @@
# test_transaction.jl

@testset "Test DBInterface.transaction" begin
    con = DBInterface.connect(DuckDB.DB, ":memory:")

    # throw an exception in DBInterface.transaction
    # this should cause a rollback to happen
    @test_throws DuckDB.QueryException DBInterface.transaction(con) do
        DBInterface.execute(con, "CREATE TABLE integers(i INTEGER)")
        return DBInterface.execute(con, "SELEC")
    end

    # verify that the table does not exist
    @test_throws DuckDB.QueryException DBInterface.execute(con, "SELECT * FROM integers")

    # no exception, this should work and be committed
    DBInterface.transaction(con) do
        return DBInterface.execute(con, "CREATE TABLE integers(i INTEGER)")
    end
    DBInterface.execute(con, "SELECT * FROM integers")

    DBInterface.close!(con)
end
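The test above relies on the transaction being rolled back when the closure throws, which also undoes the `CREATE TABLE`. The same rollback-on-error contract can be sketched against Python's built-in `sqlite3` module (an analogy for illustration, not the DuckDB.jl API):

```python
import sqlite3

# isolation_level=None puts the connection in autocommit mode, so the
# transaction is managed explicitly with BEGIN/ROLLBACK.
con = sqlite3.connect(":memory:", isolation_level=None)
con.execute("BEGIN")
con.execute("CREATE TABLE integers(i INTEGER)")
try:
    con.execute("SELEC")  # syntax error raises sqlite3.OperationalError
except sqlite3.OperationalError:
    con.execute("ROLLBACK")  # undoes the CREATE TABLE as well

# after the rollback, the table no longer exists
rows = con.execute("SELECT name FROM sqlite_master WHERE name = 'integers'").fetchall()
print(rows)
```

SQLite's DDL is transactional, so the failed statement takes the table creation down with it, mirroring what the Julia test asserts with `@test_throws`.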
43
external/duckdb/tools/juliapkg/test/test_union_type.jl
vendored
Normal file
@@ -0,0 +1,43 @@
# test_union_type.jl

@testset "Test Union Type" begin
    db = DBInterface.connect(DuckDB.DB)
    con = DBInterface.connect(db)

    DBInterface.execute(
        con,
        """
        create table tbl (
            u UNION (a BOOL, b VARCHAR)
        );
        """
    )
    DBInterface.execute(
        con,
        """
        insert into tbl VALUES('str'), (true);
        """
    )
    df = DataFrame(DBInterface.execute(
        con,
        """
        select u from tbl;
        """
    ))
    @test isequal(df.u, ["str", true])
    DBInterface.execute(
        con,
        """
        insert into tbl VALUES(NULL);
        """
    )
    df = DataFrame(DBInterface.execute(
        con,
        """
        select u from tbl;
        """
    ))
    @test isequal(df.u, ["str", true, missing])
end
22
external/duckdb/tools/juliapkg/update_api.sh
vendored
Executable file
@@ -0,0 +1,22 @@
set -euo pipefail

echo "Updating api.jl..."

OLD_API_FILE=tools/juliapkg/src/api_old.jl
ORIG_DIR=$(pwd)
GIT_ROOT_DIR=$(git rev-parse --show-toplevel)
cd "$GIT_ROOT_DIR"

# Generate the Julia API
python tools/juliapkg/scripts/generate_c_api_julia.py \
    --auto-1-index \
    --capi-dir src/include/duckdb/main/capi/header_generation \
    tools/juliapkg/src/api.jl

echo "Formatting..."
cd "$ORIG_DIR"
./format.sh
53
external/duckdb/tools/release-pip.py
vendored
Normal file
@@ -0,0 +1,53 @@
import urllib.request, ssl, json, tempfile, os, sys, re, subprocess

if len(sys.argv) < 2:
    print("Usage: [release_tag]")
    exit(1)

if os.getenv('TWINE_USERNAME') is None or os.getenv('TWINE_PASSWORD') is None:
    print("Can't find TWINE_USERNAME or TWINE_PASSWORD in env")
    exit(-1)

release_name = sys.argv[1]
release_rev = None

request = urllib.request.Request("https://api.github.com/repos/duckdb/duckdb/git/refs/tags/")
with urllib.request.urlopen(request, context=ssl._create_unverified_context()) as url:
    data = json.loads(url.read().decode())

for ref in data:
    ref_name = ref['ref'].replace('refs/tags/', '')
    if ref_name == release_name:
        release_rev = ref['object']['sha']

if release_rev is None:
    print("Could not find hash for tag %s" % sys.argv[1])
    exit(-2)

print("Using sha %s for release %s" % (release_rev, release_name))

binurl = "http://download.duckdb.org/rev/%s/python/" % release_rev
# assemble python files for release

fdir = tempfile.mkdtemp()
print(fdir)

upload_files = []
request = urllib.request.Request(binurl)
with urllib.request.urlopen(request, context=ssl._create_unverified_context()) as url:
    data = url.read().decode()
    f_matches = re.findall(r'href="([^"]+\.(whl|tar\.gz))"', data)
    for m in f_matches:
        if '.dev' in m[0]:
            continue
        print("Downloading %s" % m[0])
        url = binurl + '/' + m[0]
        local_file = fdir + '/' + m[0]
        urllib.request.urlretrieve(url, local_file)
        upload_files.append(local_file)

if len(upload_files) < 1:
    print("Could not find any binaries")
    exit(-3)

subprocess.run(['twine', 'upload', '--skip-existing'] + upload_files)
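The `re.findall` call in the script uses two capturing groups, so each match comes back as a tuple rather than a plain string; that is why the loop indexes `m[0]` for the filename. A minimal sketch with a made-up directory listing (the HTML below is illustrative, not real download.duckdb.org output):

```python
import re

# Hypothetical directory-listing snippet for illustration only.
html = '<a href="duckdb-1.0.0-cp311-none-any.whl">whl</a> <a href="duckdb-1.0.0.tar.gz">sdist</a>'

# With two groups, findall yields (full_filename, extension) tuples:
# group 1 is the whole filename, group 2 is the matched extension.
matches = re.findall(r'href="([^"]+\.(whl|tar\.gz))"', html)
print(matches)
```

Because of the tuple shape, `m[0]` is the downloadable filename and `m[1]` only records whether the link was a wheel or an sdist.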
69
external/duckdb/tools/shell/CMakeLists.txt
vendored
Normal file
@@ -0,0 +1,69 @@
include_directories(include)
include_directories(../sqlite3_api_wrapper/include)
if(NOT WIN32)
  add_subdirectory(linenoise)
  add_definitions(-DHAVE_LINENOISE=1)
  include_directories(../../third_party/utf8proc/include)
  include_directories(linenoise/include)
endif()
set(SHELL_SOURCES ${SHELL_SOURCES} shell.cpp shell_renderer.cpp
                  shell_highlight.cpp)

option(STATIC_LIBCPP "Statically link CLI to libc++" FALSE)

add_executable(shell ${SHELL_SOURCES})
target_link_libraries(shell sqlite3_api_wrapper_static
                      ${DUCKDB_EXTRA_LINK_FLAGS})
link_threads(shell "")
if(STATIC_LIBCPP)
  message("Statically linking CLI")
  target_link_libraries(shell -static-libstdc++ -static-libgcc)
endif()

if(NOT AMALGAMATION_BUILD AND NOT WIN32)
  target_link_libraries(shell duckdb_utf8proc)
endif()

function(ensure_variable_is_number INPUT_VERSION OUT_RESULT)
  if(NOT "${${INPUT_VERSION}}" MATCHES "^[0-9]+$")
    message(
      WARNING
        "VERSION PARAMETER ${INPUT_VERSION} \"${${INPUT_VERSION}}\" IS NOT A NUMBER - SETTING TO 0"
    )
    set(${OUT_RESULT}
        0
        PARENT_SCOPE)
  else()
    set(${OUT_RESULT}
        ${${INPUT_VERSION}}
        PARENT_SCOPE)
  endif()
endfunction()

if(WIN32 AND NOT MINGW)
  string(TIMESTAMP DUCKDB_COPYRIGHT_YEAR "%Y")
  ensure_variable_is_number(DUCKDB_MAJOR_VERSION RC_MAJOR_VERSION)
  ensure_variable_is_number(DUCKDB_MINOR_VERSION RC_MINOR_VERSION)
  ensure_variable_is_number(DUCKDB_PATCH_VERSION RC_PATCH_VERSION)
  ensure_variable_is_number(DUCKDB_DEV_ITERATION RC_DEV_ITERATION)

  set(CMAKE_RC_FLAGS
      "${CMAKE_RC_FLAGS} -D DUCKDB_VERSION=\"${DUCKDB_VERSION}\"")
  set(CMAKE_RC_FLAGS
      "${CMAKE_RC_FLAGS} -D DUCKDB_MAJOR_VERSION=\"${RC_MAJOR_VERSION}\"")
  set(CMAKE_RC_FLAGS
      "${CMAKE_RC_FLAGS} -D DUCKDB_MINOR_VERSION=\"${RC_MINOR_VERSION}\"")
  set(CMAKE_RC_FLAGS
      "${CMAKE_RC_FLAGS} -D DUCKDB_PATCH_VERSION=\"${RC_PATCH_VERSION}\"")
  set(CMAKE_RC_FLAGS
      "${CMAKE_RC_FLAGS} -D DUCKDB_DEV_ITERATION=\"${RC_DEV_ITERATION}\"")
  set(CMAKE_RC_FLAGS
      "${CMAKE_RC_FLAGS} -D DUCKDB_COPYRIGHT_YEAR=\"${DUCKDB_COPYRIGHT_YEAR}\"")
  target_sources(shell PRIVATE rc/duckdb.rc)
endif()

set_target_properties(shell PROPERTIES OUTPUT_NAME duckdb)
set_target_properties(shell PROPERTIES RUNTIME_OUTPUT_DIRECTORY
                                       ${PROJECT_BINARY_DIR})

install(TARGETS shell RUNTIME DESTINATION "${INSTALL_BIN_DIR}")
50
external/duckdb/tools/shell/include/shell_highlight.hpp
vendored
Normal file
@@ -0,0 +1,50 @@
//===----------------------------------------------------------------------===//
// DuckDB
//
// shell_highlight.hpp
//
//
//===----------------------------------------------------------------------===//

#pragma once

#include "shell_state.hpp"

namespace duckdb_shell {

enum class PrintColor { STANDARD, RED, YELLOW, GREEN, GRAY, BLUE, MAGENTA, CYAN, WHITE };

enum class PrintIntensity { STANDARD, BOLD, UNDERLINE, BOLD_UNDERLINE };

enum class HighlightElementType : uint32_t {
    ERROR_TOKEN = 0,
    KEYWORD,
    NUMERIC_CONSTANT,
    STRING_CONSTANT,
    LINE_INDICATOR,
    COLUMN_NAME,
    COLUMN_TYPE,
    NUMERIC_VALUE,
    STRING_VALUE,
    TEMPORAL_VALUE,
    NULL_VALUE,
    FOOTER,
    LAYOUT,
    NONE
};

struct ShellHighlight {
    explicit ShellHighlight(ShellState &state);

    void PrintText(const string &text, PrintOutput output, PrintColor color, PrintIntensity intensity);
    void PrintText(const string &text, PrintOutput output, HighlightElementType type);

    void PrintError(string error_msg);

    bool SetColor(const char *element_type, const char *color, const char *intensity);

public:
    ShellState &state;
};

} // namespace duckdb_shell
76
external/duckdb/tools/shell/include/shell_renderer.hpp
vendored
Normal file
@@ -0,0 +1,76 @@
//===----------------------------------------------------------------------===//
// DuckDB
//
// shell_renderer.hpp
//
//
//===----------------------------------------------------------------------===//

#pragma once

#include "shell_state.hpp"

namespace duckdb_shell {
struct ShellState;

class ShellRenderer {
public:
    explicit ShellRenderer(ShellState &state);
    virtual ~ShellRenderer() = default;

    ShellState &state;
    bool show_header;
    string col_sep;
    string row_sep;

public:
    static bool IsColumnar(RenderMode mode);
};

struct ColumnarResult {
    idx_t column_count = 0;
    vector<string> data;
    vector<int> types;
    vector<idx_t> column_width;
    vector<bool> right_align;
    vector<const char *> type_names;
};

struct RowResult {
    vector<const char *> column_names;
    vector<const char *> data;
    vector<int> types;
    sqlite3_stmt *pStmt = nullptr;
};

class ColumnRenderer : public ShellRenderer {
public:
    explicit ColumnRenderer(ShellState &state);

    virtual void RenderHeader(ColumnarResult &result) = 0;
    virtual void RenderFooter(ColumnarResult &result);

    virtual const char *GetColumnSeparator() = 0;
    virtual const char *GetRowSeparator() = 0;
    virtual const char *GetRowStart() {
        return nullptr;
    }

    void RenderAlignedValue(ColumnarResult &result, idx_t i);
};

class RowRenderer : public ShellRenderer {
public:
    explicit RowRenderer(ShellState &state);

    bool first_row = true;

public:
    virtual void Render(RowResult &result);

    virtual void RenderHeader(RowResult &result);
    virtual void RenderRow(RowResult &result) = 0;
    virtual void RenderFooter(RowResult &result);
};

} // namespace duckdb_shell
215
external/duckdb/tools/shell/include/shell_state.hpp
vendored
Normal file
@@ -0,0 +1,215 @@
//===----------------------------------------------------------------------===//
// DuckDB
//
// shell_state.hpp
//
//
//===----------------------------------------------------------------------===//

#pragma once

#include <vector>
#include <string>
#include <cstdint>
#include <memory>
#include "duckdb/common/string_util.hpp"
#include "duckdb/common/unique_ptr.hpp"

struct sqlite3;
struct sqlite3_stmt;
enum class MetadataResult : uint8_t;

namespace duckdb_shell {
using duckdb::unique_ptr;
using std::string;
using std::vector;
struct ColumnarResult;
struct RowResult;
class ColumnRenderer;
class RowRenderer;

using idx_t = uint64_t;

enum class RenderMode : uint32_t {
    LINE = 0,  /* One column per line. Blank line between records */
    COLUMN,    /* One record per line in neat columns */
    LIST,      /* One record per line with a separator */
    SEMI,      /* Same as RenderMode::List but append ";" to each line */
    HTML,      /* Generate an XHTML table */
    INSERT,    /* Generate SQL "insert" statements */
    QUOTE,     /* Quote values as for SQL */
    TCL,       /* Generate ANSI-C or TCL quoted elements */
    CSV,       /* Quote strings, numbers are plain */
    EXPLAIN,   /* Like RenderMode::Column, but do not truncate data */
    ASCII,     /* Use ASCII unit and record separators (0x1F/0x1E) */
    PRETTY,    /* Pretty-print schemas */
    EQP,       /* Converts EXPLAIN QUERY PLAN output into a graph */
    JSON,      /* Output JSON */
    MARKDOWN,  /* Markdown formatting */
    TABLE,     /* MySQL-style table formatting */
    BOX,       /* Unicode box-drawing characters */
    LATEX,     /* Latex tabular formatting */
    TRASH,     /* Discard output */
    JSONLINES, /* Output JSON Lines */
    DUCKBOX    /* Unicode box drawing - using DuckDB's own renderer */
};

enum class PrintOutput { STDOUT, STDERR };

enum class InputMode { STANDARD, FILE };

enum class LargeNumberRendering { NONE = 0, FOOTER = 1, ALL = 2, DEFAULT = 3 };

/*
** These are the allowed shellFlgs values
*/
#define SHFLG_Pagecache 0x00000001     /* The --pagecache option is used */
#define SHFLG_Lookaside 0x00000002     /* Lookaside memory is used */
#define SHFLG_Backslash 0x00000004     /* The --backslash option is used */
#define SHFLG_PreserveRowid 0x00000008 /* .dump preserves rowid values */
#define SHFLG_Newlines 0x00000010      /* .dump --newline flag */
#define SHFLG_CountChanges 0x00000020  /* .changes setting */
#define SHFLG_Echo 0x00000040          /* .echo or --echo setting */
#define SHFLG_HeaderSet 0x00000080     /* .header has been used */

/* ctype macros that work with signed characters */
#define IsSpace(X) duckdb::StringUtil::CharacterIsSpace((unsigned char)X)
#define IsDigit(X) isdigit((unsigned char)X)
#define ToLower(X) (char)tolower((unsigned char)X)

/*
** State information about the database connection is contained in an
** instance of the following structure.
*/
struct ShellState {
public:
    ShellState();

    sqlite3 *db = nullptr;     /* The database */
    uint8_t openMode = 0;      /* SHELL_OPEN_NORMAL, _APPENDVFS, or _ZIPFILE */
    uint8_t doXdgOpen = 0;     /* Invoke start/open/xdg-open in output_reset() */
    int outCount = 0;          /* Revert to stdout when reaching zero */
    int lineno = 0;            /* Line number of last line read from in */
    int openFlags = 0;         /* Additional flags to open. (SQLITE_OPEN_NOFOLLOW) */
    FILE *in = nullptr;        /* Read commands from this stream */
    FILE *out = nullptr;       /* Write results here */
    int nErr = 0;              /* Number of errors seen */
    RenderMode mode = RenderMode::LINE;       /* An output mode setting */
    RenderMode modePrior = RenderMode::LINE;  /* Saved mode */
    RenderMode cMode = RenderMode::LINE;      /* temporary output mode for the current query */
    RenderMode normalMode = RenderMode::LINE; /* Output mode before ".explain on" */
    bool showHeader = false;   /* True to show column names in List or Column mode */
    unsigned shellFlgs = 0;    /* Various flags */
    unsigned priorShFlgs = 0;  /* Saved copy of flags */
    int64_t szMax = 0;         /* --maxsize argument to .open */
    char *zDestTable = nullptr; /* Name of destination table when RenderMode::Insert */
    char *zTempFile = nullptr;  /* Temporary file that might need deleting */
    string colSeparator;       /* Column separator character for several modes */
    string rowSeparator;       /* Row separator character for RenderMode::Ascii */
    string colSepPrior;        /* Saved column separator */
    string rowSepPrior;        /* Saved row separator */
    vector<int> colWidth;      /* Requested width of each column in columnar modes */
    string nullValue;          /* The text to print when a NULL comes back from the database */
    int columns = 0;           /* Column-wise DuckBox rendering */
    string outfile;            /* Filename for *out */
    string zDbFilename;        /* name of the database file */
    sqlite3_stmt *pStmt = nullptr; /* Current statement if any. */
    FILE *pLog = nullptr;      /* Write log output here */
    size_t max_rows = 0;       /* The maximum number of rows to render in DuckBox mode */
    size_t max_width = 0;      /* The maximum number of characters to render horizontally in DuckBox mode */
    //! Decimal separator (if any)
    char decimal_separator = '\0';
    //! Thousand separator (if any)
    char thousand_separator = '\0';
    //! When to use formatting of large numbers (in DuckBox mode)
    LargeNumberRendering large_number_rendering = LargeNumberRendering::DEFAULT;
    //! The command to execute when `-ui` is passed in
    string ui_command = "CALL start_ui()";

public:
    void PushOutputMode();
    void PopOutputMode();
    void OutputCSV(const char *z, int bSep);
    void PrintRowSeparator(idx_t nArg, const char *zSep, const vector<idx_t> &actualWidth);
    void PrintMarkdownSeparator(idx_t nArg, const char *zSep, const vector<int> &colTypes,
                                const vector<idx_t> &actualWidth);
    void OutputCString(const char *z);
    void OutputQuotedString(const char *z);
    void OutputQuotedEscapedString(const char *z);
    void OutputHexBlob(const void *pBlob, int nBlob);
    void PrintSchemaLine(const char *z, const char *zTail);
    void PrintSchemaLineN(char *z, int n, const char *zTail);
    void PrintOptionallyQuotedIdentifier(const char *z);
    bool IsNumber(const char *z, int *realnum);
    void OutputJSONString(const char *z, int n);
    void PrintDashes(idx_t N);
    void UTF8WidthPrint(FILE *pOut, idx_t w, const string &str, bool right_align);
    bool SetOutputMode(const char *mode, const char *tbl_name);
    bool ImportData(const char **azArg, idx_t nArg);
    bool OpenDatabase(const char **azArg, idx_t nArg);
    bool SetOutputFile(const char **azArg, idx_t nArg, char output_mode);
    bool ReadFromFile(const string &file);
    bool DisplaySchemas(const char **azArg, idx_t nArg);
    MetadataResult DisplayEntries(const char **azArg, idx_t nArg, char type);
    void ShowConfiguration();

    idx_t RenderLength(const char *z);
    idx_t RenderLength(const string &str);
    void SetBinaryMode();
    void SetTextMode();
    static idx_t StringLength(const char *z);
    void SetTableName(const char *zName);
    int RunTableDumpQuery(const char *zSelect);
    void PrintValue(const char *str);
    void Print(PrintOutput output, const char *str);
    void Print(PrintOutput output, const string &str);
    void Print(const char *str);
    void Print(const string &str);
    void PrintPadded(const char *str, idx_t len);
    bool ColumnTypeIsInteger(const char *type);
    string strdup_handle_newline(const char *z);
    ColumnarResult ExecuteColumnar(sqlite3_stmt *pStmt);
    unique_ptr<ColumnRenderer> GetColumnRenderer();
    unique_ptr<RowRenderer> GetRowRenderer();
    unique_ptr<RowRenderer> GetRowRenderer(RenderMode mode);
    void ExecutePreparedStatementColumnar(sqlite3_stmt *pStmt);
    char **TableColumnList(const char *zTab);
    void ExecutePreparedStatement(sqlite3_stmt *pStmt);

    void PrintDatabaseError(const char *zErr);
    int ShellDatabaseError(sqlite3 *db);
    int RunInitialCommand(char *sql, bool bail);

    int RenderRow(RowRenderer &renderer, RowResult &result);

    int ExecuteSQL(const char *zSql, /* SQL to be evaluated */
                   char **pzErrMsg   /* Error msg written here */
    );
    int RunSchemaDumpQuery(const char *zQuery);
    void OpenDB(int openFlags);

    void SetOrClearFlag(unsigned mFlag, const char *zArg);
    bool ShellHasFlag(int flag) {
        return (shellFlgs & flag) != 0;
    }

    void ShellSetFlag(int flag) {
        shellFlgs |= flag;
    }

    void ShellClearFlag(int flag) {
        shellFlgs &= ~flag;
    }
    void ResetOutput();
    void ClearTempFile();
    void NewTempFile(const char *zSuffix);
    int DoMetaCommand(char *zLine);

    int RunOneSqlLine(InputMode mode, char *zSql);
    string GetDefaultDuckDBRC();
    bool ProcessDuckDBRC(const char *file);
    bool ProcessFile(const string &file, bool is_duckdb_rc = false);
    int ProcessInput(InputMode mode);
};

} // namespace duckdb_shell
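The `SHFLG_*` constants in `shell_state.hpp` are single-bit masks packed into one `shellFlgs` word, and `ShellHasFlag`, `ShellSetFlag`, and `ShellClearFlag` are plain bit operations on it. A small Python sketch of the same mechanics (constant values copied from the header):

```python
# Single-bit flag masks, values as defined in shell_state.hpp.
SHFLG_Echo = 0x00000040       # .echo or --echo setting
SHFLG_HeaderSet = 0x00000080  # .header has been used

flags = 0
flags |= SHFLG_Echo                    # ShellSetFlag: turn the bit on
has_echo = (flags & SHFLG_Echo) != 0   # ShellHasFlag: test the bit
flags |= SHFLG_HeaderSet               # set a second, independent flag
flags &= ~SHFLG_Echo                   # ShellClearFlag: clears only this bit
print(hex(flags), has_echo)
```

Because each flag occupies its own bit, setting or clearing one never disturbs the others.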
7
external/duckdb/tools/shell/linenoise/CMakeLists.txt
vendored
Normal file
@@ -0,0 +1,7 @@
include_directories(include)

add_library(duckdb_linenoise OBJECT highlighting.cpp history.cpp linenoise.cpp
            linenoise-c.cpp rendering.cpp terminal.cpp)
set(SHELL_SOURCES
    ${SHELL_SOURCES} $<TARGET_OBJECTS:duckdb_linenoise>
    PARENT_SCOPE)
25
external/duckdb/tools/shell/linenoise/LICENSE
vendored
Normal file
@@ -0,0 +1,25 @@
Copyright (c) 2010-2014, Salvatore Sanfilippo <antirez at gmail dot com>
Copyright (c) 2010-2013, Pieter Noordhuis <pcnoordhuis at gmail dot com>

All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice,
  this list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above copyright notice,
  this list of conditions and the following disclaimer in the documentation
  and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
270
external/duckdb/tools/shell/linenoise/highlighting.cpp
vendored
Normal file
@@ -0,0 +1,270 @@
#include "linenoise.hpp"
#include "linenoise.h"
#include "highlighting.hpp"
#include "duckdb/parser/parser.hpp"
#include "duckdb/common/string.hpp"

#if defined(_WIN32) || defined(__WIN32__) || defined(WIN32)
// disable highlighting on windows (for now?)
#define DISABLE_HIGHLIGHT
#endif

namespace duckdb {

#ifdef DISABLE_HIGHLIGHT
static int enableHighlighting = 0;
#else
static int enableHighlighting = 1;
#endif
struct Color {
    const char *color_name;
    const char *highlight;
};
static Color terminal_colors[] = {{"red", "\033[31m"},           {"green", "\033[32m"},
                                  {"yellow", "\033[33m"},        {"blue", "\033[34m"},
                                  {"magenta", "\033[35m"},       {"cyan", "\033[36m"},
                                  {"white", "\033[37m"},         {"brightblack", "\033[90m"},
                                  {"brightred", "\033[91m"},     {"brightgreen", "\033[92m"},
                                  {"brightyellow", "\033[93m"},  {"brightblue", "\033[94m"},
                                  {"brightmagenta", "\033[95m"}, {"brightcyan", "\033[96m"},
                                  {"brightwhite", "\033[97m"},   {nullptr, nullptr}};
static std::string bold = "\033[1m";
static std::string underline = "\033[4m";
static std::string keyword = "\033[32m";
static std::string continuation_selected = "\033[32m";
static std::string constant = "\033[33m";
static std::string continuation = "\033[90m";
static std::string comment = "\033[90m";
static std::string error = "\033[31m";
static std::string reset = "\033[00m";

void Highlighting::Enable() {
    enableHighlighting = 1;
}

void Highlighting::Disable() {
    enableHighlighting = 0;
}

bool Highlighting::IsEnabled() {
    return enableHighlighting;
}

const char *Highlighting::GetColorOption(const char *option) {
    size_t index = 0;
    while (terminal_colors[index].color_name) {
        if (strcmp(terminal_colors[index].color_name, option) == 0) {
            return terminal_colors[index].highlight;
        }
        index++;
    }
    return nullptr;
}

void Highlighting::SetHighlightingColor(HighlightingType type, const char *color) {
    switch (type) {
    case HighlightingType::KEYWORD:
        keyword = color;
        break;
    case HighlightingType::CONSTANT:
        constant = color;
        break;
    case HighlightingType::COMMENT:
        comment = color;
        break;
    case HighlightingType::ERROR:
        error = color;
        break;
    case HighlightingType::CONTINUATION:
        continuation = color;
        break;
    case HighlightingType::CONTINUATION_SELECTED:
        continuation_selected = color;
        break;
    }
}

static tokenType convertToken(duckdb::SimplifiedTokenType token_type) {
    switch (token_type) {
    case duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_IDENTIFIER:
        return tokenType::TOKEN_IDENTIFIER;
    case duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_NUMERIC_CONSTANT:
        return tokenType::TOKEN_NUMERIC_CONSTANT;
    case duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_STRING_CONSTANT:
        return tokenType::TOKEN_STRING_CONSTANT;
    case duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_OPERATOR:
        return tokenType::TOKEN_OPERATOR;
    case duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_KEYWORD:
        return tokenType::TOKEN_KEYWORD;
    case duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_COMMENT:
        return tokenType::TOKEN_COMMENT;
    default:
        throw duckdb::InternalException("Unrecognized token type");
    }
}

static vector<highlightToken> GetParseTokens(char *buf, size_t len) {
    string sql(buf, len);
    auto parseTokens = duckdb::Parser::Tokenize(sql);

    vector<highlightToken> tokens;
    for (auto &token : parseTokens) {
        highlightToken new_token;
        new_token.type = convertToken(token.type);
        new_token.start = token.start;
        tokens.push_back(new_token);
    }

    if (!tokens.empty() && tokens[0].start > 0) {
        highlightToken new_token;
        new_token.type = tokenType::TOKEN_IDENTIFIER;
        new_token.start = 0;
        tokens.insert(tokens.begin(), new_token);
    }
    if (tokens.empty() && sql.size() > 0) {
        highlightToken new_token;
        new_token.type = tokenType::TOKEN_IDENTIFIER;
        new_token.start = 0;
        tokens.push_back(new_token);
    }
    return tokens;
}

static vector<highlightToken> GetDotCommandTokens(char *buf, size_t len) {
    vector<highlightToken> tokens;

    // identifier token for the dot command itself
    highlightToken dot_token;
    dot_token.type = tokenType::TOKEN_KEYWORD;
    dot_token.start = 0;
    tokens.push_back(dot_token);

    for (idx_t i = 0; i + 1 < len; i++) {
        if (Linenoise::IsSpace(buf[i])) {
            highlightToken argument_token;
            argument_token.type = tokenType::TOKEN_STRING_CONSTANT;
            argument_token.start = i + 1;
            tokens.push_back(argument_token);
        }
    }
    return tokens;
}

vector<highlightToken> Highlighting::Tokenize(char *buf, size_t len, bool is_dot_command, searchMatch *match) {
    vector<highlightToken> tokens;
    if (!is_dot_command) {
        // SQL query - use parser to obtain tokens
        tokens = GetParseTokens(buf, len);
    } else {
        // . command
        tokens = GetDotCommandTokens(buf, len);
    }
    if (match) {
        // we have a search match - insert it into the token list
|
||||
// we want to insert a search token with start = match_start, end = match_end
|
||||
// first figure out which token type we would have at match_end (if any)
|
||||
for (size_t i = 0; i + 1 < tokens.size(); i++) {
|
||||
if (tokens[i].start <= match->match_start && tokens[i + 1].start >= match->match_start) {
|
||||
// this token begins after the search position, insert the token here
|
||||
size_t token_position = i + 1;
|
||||
auto end_type = tokens[i].type;
|
||||
if (tokens[i].start == match->match_start) {
|
||||
// exact start: only set the search match
|
||||
tokens[i].search_match = true;
|
||||
} else {
|
||||
// non-exact start: add a new token
|
||||
highlightToken search_token;
|
||||
search_token.type = tokens[i].type;
|
||||
search_token.start = match->match_start;
|
||||
search_token.search_match = true;
|
||||
tokens.insert(tokens.begin() + token_position, search_token);
|
||||
token_position++;
|
||||
}
|
||||
|
||||
// move forwards
|
||||
while (token_position < tokens.size() && tokens[token_position].start < match->match_end) {
|
||||
// this token is
|
||||
// mark this token as a search token
|
||||
end_type = tokens[token_position].type;
|
||||
tokens[token_position].search_match = true;
|
||||
token_position++;
|
||||
}
|
||||
if (token_position >= tokens.size() || tokens[token_position].start > match->match_end) {
|
||||
// insert the token that marks the end of the search
|
||||
highlightToken end_token;
|
||||
end_token.type = end_type;
|
||||
end_token.start = match->match_end;
|
||||
tokens.insert(tokens.begin() + token_position, end_token);
|
||||
token_position++;
|
||||
}
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
return tokens;
|
||||
}
|
||||
|
||||
string Highlighting::HighlightText(char *buf, size_t len, size_t start_pos, size_t end_pos,
|
||||
const vector<highlightToken> &tokens) {
|
||||
duckdb::stringstream ss;
|
||||
size_t prev_pos = 0;
|
||||
for (size_t i = 0; i < tokens.size(); i++) {
|
||||
size_t next = i + 1 < tokens.size() ? tokens[i + 1].start : len;
|
||||
if (next < start_pos) {
|
||||
// this token is not rendered at all
|
||||
continue;
|
||||
}
|
||||
|
||||
auto &token = tokens[i];
|
||||
size_t start = token.start > start_pos ? token.start : start_pos;
|
||||
size_t end = next > end_pos ? end_pos : next;
|
||||
if (end <= start) {
|
||||
continue;
|
||||
}
|
||||
if (prev_pos > start) {
|
||||
#ifdef DEBUG
|
||||
throw InternalException("ERROR - Rendering at position %llu after rendering at position %llu\n", start,
|
||||
prev_pos);
|
||||
#endif
|
||||
Linenoise::Log("ERROR - Rendering at position %llu after rendering at position %llu\n", start, prev_pos);
|
||||
continue;
|
||||
}
|
||||
prev_pos = start;
|
||||
std::string text = std::string(buf + start, end - start);
|
||||
if (token.search_match) {
|
||||
ss << underline;
|
||||
}
|
||||
switch (token.type) {
|
||||
case tokenType::TOKEN_KEYWORD:
|
||||
ss << keyword << text << reset;
|
||||
break;
|
||||
case tokenType::TOKEN_NUMERIC_CONSTANT:
|
||||
case tokenType::TOKEN_STRING_CONSTANT:
|
||||
ss << constant << text << reset;
|
||||
break;
|
||||
case tokenType::TOKEN_CONTINUATION:
|
||||
ss << continuation << text << reset;
|
||||
break;
|
||||
case tokenType::TOKEN_CONTINUATION_SELECTED:
|
||||
ss << continuation_selected << text << reset;
|
||||
break;
|
||||
case tokenType::TOKEN_BRACKET:
|
||||
ss << underline << text << reset;
|
||||
break;
|
||||
case tokenType::TOKEN_ERROR:
|
||||
ss << error << text << reset;
|
||||
break;
|
||||
case tokenType::TOKEN_COMMENT:
|
||||
ss << comment << text << reset;
|
||||
break;
|
||||
default:
|
||||
ss << text;
|
||||
if (token.search_match) {
|
||||
ss << reset;
|
||||
}
|
||||
}
|
||||
}
|
||||
return ss.str();
|
||||
}
|
||||
|
||||
} // namespace duckdb
|
||||
367
external/duckdb/tools/shell/linenoise/history.cpp
vendored
Normal file
@@ -0,0 +1,367 @@
#include "history.hpp"
#include "linenoise.hpp"
#include "terminal.hpp"
#include "duckdb_shell_wrapper.h"
#include "sqlite3.h"
#include "utf8proc_wrapper.hpp"
#include <sys/stat.h>

namespace duckdb {
#define LINENOISE_DEFAULT_HISTORY_MAX_LEN 1000
static idx_t history_max_len = LINENOISE_DEFAULT_HISTORY_MAX_LEN;
static idx_t history_len = 0;
static char **history = nullptr;
static char *history_file = nullptr;

/* Frees the history, but does not reset it. Only used when we have to
 * exit() to avoid memory leaks being reported by valgrind & co. */
void History::Free() {
	if (history) {
		for (idx_t j = 0; j < history_len; j++) {
			free(history[j]);
		}
		free(history);
	}
}

idx_t History::GetLength() {
	return history_len;
}

const char *History::GetEntry(idx_t index) {
	if (!history || index >= history_len) {
		// FIXME: print debug message
		return "";
	}
	return history[index];
}

void History::Overwrite(idx_t index, const char *new_entry) {
	if (!history || index >= history_len) {
		// FIXME: print debug message
		return;
	}

	free(history[index]);
	history[index] = strdup(new_entry);
}

void History::RemoveLastEntry() {
	history_len--;
	free(history[history_len]);
}

int History::Add(const char *line) {
	return History::Add(line, strlen(line));
}

int History::Add(const char *line, idx_t len) {
	char *linecopy;

	if (history_max_len == 0) {
		return 0;
	}

	/* Initialization on first call. */
	if (history == nullptr) {
		history = (char **)malloc(sizeof(char *) * history_max_len);
		if (history == nullptr) {
			return 0;
		}
		memset(history, 0, (sizeof(char *) * history_max_len));
	}

	/* Don't add duplicated lines. */
	if (history_len && !strcmp(history[history_len - 1], line)) {
		return 0;
	}

	if (!Utf8Proc::IsValid(line, len)) {
		// don't add invalid UTF8 to history
		return 0;
	}

	/* Add a heap-allocated copy of the line to the history.
	 * If we reached the max length, remove the oldest line. */
	if (!Terminal::IsMultiline()) {
		// replace all newlines with spaces
		linecopy = strdup(line);
		if (!linecopy) {
			return 0;
		}
		for (auto ptr = linecopy; *ptr; ptr++) {
			if (*ptr == '\n' || *ptr == '\r') {
				*ptr = ' ';
			}
		}
	} else {
		// replace all '\n' with '\r\n'
		idx_t replaced_newline_count = 0;
		idx_t len;
		for (len = 0; line[len]; len++) {
			if (line[len] == '\r' && line[len + 1] == '\n') {
				// \r\n - skip past the \n
				len++;
			} else if (line[len] == '\n') {
				replaced_newline_count++;
			}
		}
		linecopy = (char *)malloc((len + replaced_newline_count + 1) * sizeof(char));
		if (!linecopy) {
			return 0;
		}
		idx_t pos = 0;
		for (len = 0; line[len]; len++) {
			if (line[len] == '\r' && line[len + 1] == '\n') {
				// \r\n - skip past the \n
				linecopy[pos++] = '\r';
				len++;
			} else if (line[len] == '\n') {
				linecopy[pos++] = '\r';
			}
			linecopy[pos++] = line[len];
		}
		linecopy[pos] = '\0';
	}
	if (history_len == history_max_len) {
		free(history[0]);
		memmove(history, history + 1, sizeof(char *) * (history_max_len - 1));
		history_len--;
	}
	history[history_len] = linecopy;
	history_len++;
	if (history_file && strlen(line) > 0) {
		// if there is a history file that we loaded from
		// append to the history
		// this way we can recover history in case of a crash
		FILE *fp;

		fp = fopen(history_file, "a");
		if (fp == nullptr) {
			return 1;
		}
		fprintf(fp, "%s\n", line);
		fclose(fp);
	}
	return 1;
}

int History::SetMaxLength(idx_t len) {
	char **new_entry;

	if (len < 1) {
		return 0;
	}
	if (history) {
		idx_t tocopy = history_len;

		new_entry = (char **)malloc(sizeof(char *) * len);
		if (new_entry == nullptr) {
			return 0;
		}

		/* If we can't copy everything, free the elements we'll not use. */
		if (len < tocopy) {
			for (idx_t j = 0; j < tocopy - len; j++) {
				free(history[j]);
			}
			tocopy = len;
		}
		memset(new_entry, 0, sizeof(char *) * len);
		memcpy(new_entry, history + (history_len - tocopy), sizeof(char *) * tocopy);
		free(history);
		history = new_entry;
	}
	history_max_len = len;
	if (history_len > history_max_len) {
		history_len = history_max_len;
	}
	return 1;
}

int History::Save(const char *filename) {
	mode_t old_umask = umask(S_IXUSR | S_IRWXG | S_IRWXO);
	FILE *fp;

	fp = fopen(filename, "w");
	umask(old_umask);
	if (fp == nullptr) {
		return -1;
	}
	chmod(filename, S_IRUSR | S_IWUSR);
	for (idx_t j = 0; j < history_len; j++) {
		fprintf(fp, "%s\n", history[j]);
	}
	fclose(fp);
	return 0;
}

struct LineReader {
	static constexpr idx_t LINE_BUFFER_SIZE = LINENOISE_MAX_LINE * 2ULL;

public:
	LineReader() : fp(nullptr), filename(nullptr), end_of_file(false), position(0), capacity(0), total_read(0) {
		line_buffer[LINENOISE_MAX_LINE] = '\0';
		data_buffer[LINE_BUFFER_SIZE] = '\0';
	}

	bool Init(const char *filename_p) {
		filename = filename_p;
		fp = fopen(filename, "r");
		return fp != nullptr;
	}

	void Close() {
		if (fp) {
			fclose(fp);
			fp = nullptr;
		}
	}

	const char *GetLine() {
		return line_buffer;
	}

	idx_t GetNextNewline() {
		for (idx_t i = position; i < capacity; i++) {
			if (data_buffer[i] == '\r' || data_buffer[i] == '\n') {
				return i;
			}
		}
		return capacity;
	}

	void SkipNewline() {
		if (position >= capacity) {
			// we are already at the end - fill the buffer
			FillBuffer();
		}
		if (position < capacity && data_buffer[position] == '\n') {
			position++;
		}
	}

	bool NextLine() {
		idx_t line_size = 0;
		while (true) {
			// find the next newline in the current buffer (if any)
			idx_t i = GetNextNewline();
			// copy over the data and move to the next byte
			idx_t read_count = i - position;
			if (line_size + read_count > LINENOISE_MAX_LINE) {
				// exceeded max line size
				// move on to the next line and don't add to history
				// skip to the next newline
				bool found_next_newline = false;
				while (!found_next_newline && capacity > 0) {
					i = GetNextNewline();
					if (i < capacity) {
						found_next_newline = true;
					}
					if (!found_next_newline) {
						// read more data
						FillBuffer();
					}
				}
				if (!found_next_newline) {
					// no newline found - skip
					return false;
				}
				// newline found - adjust position and read the next line
				position = i + 1;
				if (data_buffer[i] == '\r') {
					// \r\n - skip the next byte as well
					SkipNewline();
				}
				// discard the partially copied over-long line
				line_size = 0;
				continue;
			}
			memcpy(line_buffer + line_size, data_buffer + position, read_count);
			line_size += read_count;

			if (i < capacity) {
				// we're still within the buffer - this means we found a newline in the buffer
				line_buffer[line_size] = '\0';
				position = i + 1;
				if (data_buffer[i] == '\r') {
					// \r\n - skip the next byte as well
					SkipNewline();
				}
				if (line_size == 0 || !Utf8Proc::IsValid(line_buffer, line_size)) {
					// line is empty OR not valid UTF8
					// move on to the next line and don't add to history
					line_size = 0;
					continue;
				}
				return true;
			}
			// we need to read more data - fill up the buffer
			FillBuffer();
			if (capacity == 0) {
				// no more data available - return true if there is anything we copied over (i.e. part of the next line)
				return line_size > 0;
			}
		}
	}

	void FillBuffer() {
		if (end_of_file || !fp) {
			return;
		}
		size_t read_data = fread(data_buffer, 1, LINE_BUFFER_SIZE, fp);
		position = 0;
		capacity = read_data;
		total_read += read_data;
		data_buffer[read_data] = '\0';

		if (read_data == 0) {
			end_of_file = true;
		}
		if (total_read >= LINENOISE_MAX_HISTORY) {
			fprintf(stderr, "History file \"%s\" exceeds maximum history file size of %d MB - skipping full load\n",
			        filename, LINENOISE_MAX_HISTORY / 1024 / 1024);
			capacity = 0;
			end_of_file = true;
		}
	}

private:
	FILE *fp;
	const char *filename;
	char line_buffer[LINENOISE_MAX_LINE + 1];
	char data_buffer[LINE_BUFFER_SIZE + 1];
	bool end_of_file;
	idx_t position;
	idx_t capacity;
	idx_t total_read;
};

int History::Load(const char *filename) {
	LineReader reader;
	if (!reader.Init(filename)) {
		return -1;
	}

	std::string result;
	while (reader.NextLine()) {
		auto buf = reader.GetLine();
		if (result.empty() && buf[0] == '.') {
			// if the first character is a dot this is a dot command
			// add the full line to the history
			History::Add(buf);
			continue;
		}
		// else we are parsing a SQL statement
		result += buf;
		if (sqlite3_complete(result.c_str())) {
			// this line contains a full SQL statement - add it to the history
			History::Add(result.c_str(), result.size());
			result = std::string();
			continue;
		}
		// the result does not contain a full SQL statement - add a newline delimiter and move on to the next line
		result += "\r\n";
	}
	reader.Close();

	history_file = strdup(filename);
	return 0;
}

} // namespace duckdb
50
external/duckdb/tools/shell/linenoise/include/highlighting.hpp
vendored
Normal file
@@ -0,0 +1,50 @@
//===----------------------------------------------------------------------===//
//                         DuckDB
//
// highlighting.hpp
//
//
//===----------------------------------------------------------------------===//

#pragma once

#include "duckdb/common/common.hpp"

namespace duckdb {
struct searchMatch;

enum class tokenType : uint8_t {
	TOKEN_IDENTIFIER,
	TOKEN_NUMERIC_CONSTANT,
	TOKEN_STRING_CONSTANT,
	TOKEN_OPERATOR,
	TOKEN_KEYWORD,
	TOKEN_COMMENT,
	TOKEN_CONTINUATION,
	TOKEN_CONTINUATION_SELECTED,
	TOKEN_BRACKET,
	TOKEN_ERROR
};

enum class HighlightingType { KEYWORD, CONSTANT, COMMENT, ERROR, CONTINUATION, CONTINUATION_SELECTED };

struct highlightToken {
	tokenType type;
	size_t start = 0;
	bool search_match = false;
};

class Highlighting {
public:
	static void Enable();
	static void Disable();
	static bool IsEnabled();
	static const char *GetColorOption(const char *option);
	static void SetHighlightingColor(HighlightingType type, const char *color);

	static vector<highlightToken> Tokenize(char *buf, size_t len, bool is_dot_command, searchMatch *match);
	static string HighlightText(char *buf, size_t len, size_t start_pos, size_t end_pos,
	                            const vector<highlightToken> &tokens);
};

} // namespace duckdb
29
external/duckdb/tools/shell/linenoise/include/history.hpp
vendored
Normal file
@@ -0,0 +1,29 @@
//===----------------------------------------------------------------------===//
//                         DuckDB
//
// history.hpp
//
//
//===----------------------------------------------------------------------===//

#pragma once

#include "duckdb/common/common.hpp"

namespace duckdb {

class History {
public:
	static void Free();
	static idx_t GetLength();
	static const char *GetEntry(idx_t index);
	static void Overwrite(idx_t index, const char *new_entry);
	static void RemoveLastEntry();
	static int Add(const char *line);
	static int Add(const char *line, idx_t len);
	static int SetMaxLength(idx_t len);
	static int Save(const char *filename);
	static int Load(const char *filename);
};

} // namespace duckdb
76
external/duckdb/tools/shell/linenoise/include/linenoise.h
vendored
Normal file
@@ -0,0 +1,76 @@
/* linenoise.h -- VERSION 1.0
 *
 * Guerrilla line editing library against the idea that a line editing lib
 * needs to be 20,000 lines of C code.
 *
 * See linenoise.c for more information.
 *
 * ------------------------------------------------------------------------
 *
 * Copyright (c) 2010-2014, Salvatore Sanfilippo <antirez at gmail dot com>
 * Copyright (c) 2010-2013, Pieter Noordhuis <pcnoordhuis at gmail dot com>
 *
 * All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions are
 * met:
 *
 *  *  Redistributions of source code must retain the above copyright
 *     notice, this list of conditions and the following disclaimer.
 *
 *  *  Redistributions in binary form must reproduce the above copyright
 *     notice, this list of conditions and the following disclaimer in the
 *     documentation and/or other materials provided with the distribution.
 *
 * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
 * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
 * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
 * A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
 * HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
 * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
 * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
 * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
 * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
 * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
 * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 */

#ifndef __LINENOISE_H
#define __LINENOISE_H

#ifdef __cplusplus
extern "C" {
#endif

typedef struct linenoiseCompletions {
	size_t len;
	char **cvec;
} linenoiseCompletions;

typedef void(linenoiseCompletionCallback)(const char *, linenoiseCompletions *);
typedef char *(linenoiseHintsCallback)(const char *, int *color, int *bold);
typedef void(linenoiseFreeHintsCallback)(void *);
void linenoiseSetCompletionCallback(linenoiseCompletionCallback *);
void linenoiseSetHintsCallback(linenoiseHintsCallback *);
void linenoiseSetFreeHintsCallback(linenoiseFreeHintsCallback *);
void linenoiseAddCompletion(linenoiseCompletions *, const char *);

char *linenoise(const char *prompt);
void linenoiseFree(void *ptr);
int linenoiseParseOption(const char **azArg, int nArg, const char **out_error);
int linenoiseHistoryAdd(const char *line);
int linenoiseHistorySetMaxLen(int len);
int linenoiseHistorySave(const char *filename);
int linenoiseHistoryLoad(const char *filename);
void linenoiseClearScreen(void);
void linenoiseSetMultiLine(int ml);
size_t linenoiseComputeRenderWidth(const char *buf, size_t len);
int linenoiseGetRenderPosition(const char *buf, size_t len, int max_width, int *n);
void linenoiseSetPrompt(const char *continuation, const char *continuationSelected);

#ifdef __cplusplus
}
#endif

#endif /* __LINENOISE_H */
190
external/duckdb/tools/shell/linenoise/include/linenoise.hpp
vendored
Normal file
@@ -0,0 +1,190 @@
//===----------------------------------------------------------------------===//
//                         DuckDB
//
// linenoise.hpp
//
//
//===----------------------------------------------------------------------===//

#pragma once

#include "duckdb/common/common.hpp"
#include "duckdb/common/exception.hpp"
#include "terminal.hpp"
#include "linenoise.h"

#define LINENOISE_MAX_LINE    204800
#define LINENOISE_MAX_HISTORY 104857600
#define LINENOISE_EDITOR

namespace duckdb {
struct highlightToken;
struct AppendBuffer;

enum class HistoryScrollDirection : uint8_t {
	LINENOISE_HISTORY_NEXT,
	LINENOISE_HISTORY_PREV,
	LINENOISE_HISTORY_START,
	LINENOISE_HISTORY_END
};
enum class Capitalization : uint8_t { CAPITALIZE, LOWERCASE, UPPERCASE };

struct searchMatch {
	size_t history_index;
	size_t match_start;
	size_t match_end;
};

struct Completion {
	string completion;
	idx_t cursor_pos;
};

struct TabCompletion {
	vector<Completion> completions;
};

class Linenoise {
public:
	Linenoise(int stdin_fd, int stdout_fd, char *buf, size_t buflen, const char *prompt);

public:
	int Edit();

	static void SetCompletionCallback(linenoiseCompletionCallback *fn);
	static void SetHintsCallback(linenoiseHintsCallback *fn);
	static void SetFreeHintsCallback(linenoiseFreeHintsCallback *fn);

	static linenoiseHintsCallback *HintsCallback();
	static linenoiseFreeHintsCallback *FreeHintsCallback();

	static void SetPrompt(const char *continuation, const char *continuationSelected);
	static size_t ComputeRenderWidth(const char *buf, size_t len);
	static int GetRenderPosition(const char *buf, size_t len, int max_width, int *n);

	static int ParseOption(const char **azArg, int nArg, const char **out_error);

	int GetPromptWidth() const;

	void RefreshLine();
	int CompleteLine(EscapeSequence &current_sequence);
	void InsertCharacter(char c);
	int EditInsert(char c);
	int EditInsertMulti(const char *c);
	void EditMoveLeft();
	void EditMoveRight();
	void EditMoveWordLeft();
	void EditMoveWordRight();
	bool EditMoveRowUp();
	bool EditMoveRowDown();
	void EditMoveHome();
	void EditMoveEnd();
	void EditMoveStartOfLine();
	void EditMoveEndOfLine();
	void EditHistoryNext(HistoryScrollDirection dir);
	void EditHistorySetIndex(idx_t index);
	void EditDelete();
	void EditBackspace();
	void EditDeletePrevWord();
	void EditDeleteNextWord();
	void EditDeleteAll();
	void EditCapitalizeNextWord(Capitalization capitalization);
	void EditRemoveSpaces();
	void EditSwapCharacter();
	void EditSwapWord();

	void StartSearch();
	void CancelSearch();
	char AcceptSearch(char nextCommand);
	void PerformSearch();
	void SearchPrev();
	void SearchNext();

#ifdef LINENOISE_EDITOR
	bool EditBufferWithEditor(const char *editor);
	bool EditFileWithEditor(const string &file_name, const char *editor);
#endif

	char Search(char c);

	void RefreshMultiLine();
	void RefreshSingleLine() const;
	void RefreshSearch();
	void RefreshShowHints(AppendBuffer &append_buffer, int plen) const;

	size_t PrevChar() const;
	size_t NextChar() const;

	void NextPosition(const char *buf, size_t len, size_t &cpos, int &rows, int &cols, int plen) const;
	void PositionToColAndRow(size_t target_pos, int &out_row, int &out_col, int &rows, int &cols) const;
	size_t ColAndRowToPosition(int target_row, int target_col) const;

	string AddContinuationMarkers(const char *buf, size_t len, int plen, int cursor_row,
	                              vector<highlightToken> &tokens) const;
	void AddErrorHighlighting(idx_t render_start, idx_t render_end, vector<highlightToken> &tokens) const;

	bool AddCompletionMarker(const char *buf, idx_t len, string &result_buffer, vector<highlightToken> &tokens) const;

	static bool IsNewline(char c);
	static bool IsWordBoundary(char c);
	static bool AllWhitespace(const char *z);
	static bool IsSpace(char c);

	TabCompletion TabComplete() const;

	static void EnableCompletionRendering();
	static void DisableCompletionRendering();
	static void EnableErrorRendering();
	static void DisableErrorRendering();

public:
	static void LogTokens(const vector<highlightToken> &tokens);
#ifdef LINENOISE_LOGGING
	// Logging
	template <typename... Args>
	static void Log(const string &msg, Args... params) {
		std::vector<ExceptionFormatValue> values;
		LogMessageRecursive(msg, values, params...);
	}

	static void LogMessageRecursive(const string &msg, std::vector<ExceptionFormatValue> &values);

	template <class T, typename... Args>
	static void LogMessageRecursive(const string &msg, std::vector<ExceptionFormatValue> &values, T param,
	                                Args... params) {
		values.push_back(ExceptionFormatValue::CreateFormatValue<T>(param));
		LogMessageRecursive(msg, values, params...);
	}
#else
	template <typename... Args>
	static void Log(const string &msg, Args... params) {
		// nop
	}
#endif

public:
	int ifd;                   /* Terminal stdin file descriptor. */
	int ofd;                   /* Terminal stdout file descriptor. */
	char *buf;                 /* Edited line buffer. */
	size_t buflen;             /* Edited line buffer size. */
	const char *prompt;        /* Prompt to display. */
	size_t plen;               /* Prompt length. */
	size_t pos;                /* Current cursor position. */
	size_t old_cursor_rows;    /* Previous refresh cursor position. */
	size_t len;                /* Current edited line length. */
	size_t y_scroll;           /* The y scroll position (multiline mode). */
	TerminalSize ws;           /* Terminal size. */
	size_t maxrows;            /* Maximum number of rows used so far (multiline mode). */
	idx_t history_index;       /* The history index we are currently editing. */
	bool clear_screen;         /* Whether we are clearing the screen. */
	bool continuation_markers; /* Whether or not to render continuation markers. */
	bool search;               /* Whether or not we are searching our history. */
	bool render;               /* Whether or not to re-render. */
	bool has_more_data;        /* Whether or not there is more data available in the buffer (copy+paste). */
	bool insert;               /* Whether or not the last action was inserting a new character. */
	std::string search_buf;                  //! The search buffer
	std::vector<searchMatch> search_matches; //! The set of search matches in our history
	size_t search_index;                     //! The current match index
};

} // namespace duckdb
122
external/duckdb/tools/shell/linenoise/include/terminal.hpp
vendored
Normal file
@@ -0,0 +1,122 @@
//===----------------------------------------------------------------------===//
//                         DuckDB
//
// terminal.hpp
//
//
//===----------------------------------------------------------------------===//

#pragma once

#include "duckdb/common/common.hpp"

namespace duckdb {

enum KEY_ACTION {
	KEY_NULL = 0,   /* NULL */
	CTRL_A = 1,     /* Ctrl-a */
	CTRL_B = 2,     /* Ctrl-b */
	CTRL_C = 3,     /* Ctrl-c */
	CTRL_D = 4,     /* Ctrl-d */
	CTRL_E = 5,     /* Ctrl-e */
	CTRL_F = 6,     /* Ctrl-f */
	CTRL_G = 7,     /* Ctrl-g */
	CTRL_H = 8,     /* Ctrl-h */
	TAB = 9,        /* Tab */
	CTRL_J = 10,    /* Ctrl-j */
	CTRL_K = 11,    /* Ctrl-k */
	CTRL_L = 12,    /* Ctrl-l */
	ENTER = 13,     /* Enter */
	CTRL_N = 14,    /* Ctrl-n */
	CTRL_O = 15,    /* Ctrl-o */
	CTRL_P = 16,    /* Ctrl-p */
	CTRL_R = 18,    /* Ctrl-r */
	CTRL_S = 19,    /* Ctrl-s */
	CTRL_T = 20,    /* Ctrl-t */
	CTRL_U = 21,    /* Ctrl-u */
	CTRL_W = 23,    /* Ctrl-w */
	CTRL_X = 24,    /* Ctrl-x */
	CTRL_Y = 25,    /* Ctrl-y */
	CTRL_Z = 26,    /* Ctrl-z */
	ESC = 27,       /* Escape */
	BACKSPACE = 127 /* Backspace */
};

enum class EscapeSequence {
	INVALID = 0,
	UNKNOWN = 1,
	CTRL_MOVE_BACKWARDS,
	CTRL_MOVE_FORWARDS,
	HOME,
	END,
	UP,
	DOWN,
	RIGHT,
	LEFT,
	DELETE,
	SHIFT_TAB,
	ESCAPE,
	ALT_A,
	ALT_B,
	ALT_C,
	ALT_D,
	ALT_E,
	ALT_F,
	ALT_G,
	ALT_H,
	ALT_I,
	ALT_J,
	ALT_K,
	ALT_L,
	ALT_M,
	ALT_N,
	ALT_O,
	ALT_P,
	ALT_Q,
	ALT_R,
	ALT_S,
	ALT_T,
	ALT_U,
	ALT_V,
	ALT_W,
	ALT_X,
	ALT_Y,
	ALT_Z,
	ALT_BACKSPACE,
	ALT_LEFT_ARROW,
	ALT_RIGHT_ARROW,
	ALT_BACKSLASH,
};

struct TerminalSize {
	int ws_col = 0;
	int ws_row = 0;
};

class Terminal {
public:
	static int IsUnsupportedTerm();
	static int EnableRawMode();
	static void DisableRawMode();
	static bool IsMultiline();
	static void SetMultiLine(int ml);

	static void ClearScreen();
	static void Beep();

	static bool IsAtty();
	static int HasMoreData(int fd);
	static TerminalSize GetTerminalSize();

	static char *EditNoTTY();
	static int EditRaw(char *buf, size_t buflen, const char *prompt);

	static EscapeSequence ReadEscapeSequence(int ifd);

private:
	static TerminalSize TryMeasureTerminalSize();
	static TerminalSize GetCursorPosition();
	static idx_t ReadEscapeSequence(int ifd, char sequence[]);
};

} // namespace duckdb
154
external/duckdb/tools/shell/linenoise/linenoise-c.cpp
vendored
Normal file
@@ -0,0 +1,154 @@
#include "linenoise.hpp"
#include "linenoise.h"
#include "history.hpp"
#include "terminal.hpp"

using duckdb::History;
using duckdb::idx_t;
using duckdb::Linenoise;
using duckdb::Terminal;

/* The high level function that is the main API of the linenoise library.
 * This function checks if the terminal has basic capabilities, just checking
 * for a blacklist of stupid terminals, and later either calls the line
 * editing function or uses dummy fgets() so that you will be able to type
 * something even in the most desperate of the conditions. */
char *linenoise(const char *prompt) {
	char buf[LINENOISE_MAX_LINE];
	int count;

	if (!Terminal::IsAtty()) {
		/* Not a tty: read from file / pipe. In this mode we don't want any
		 * limit to the line size, so we call a function to handle that. */
		return Terminal::EditNoTTY();
	} else if (Terminal::IsUnsupportedTerm()) {
		size_t len;

		printf("%s", prompt);
		fflush(stdout);
		if (fgets(buf, LINENOISE_MAX_LINE, stdin) == NULL) {
			return NULL;
		}
		len = strlen(buf);
		while (len && (buf[len - 1] == '\n' || buf[len - 1] == '\r')) {
			len--;
			buf[len] = '\0';
		}
		return strdup(buf);
	} else {
		count = Terminal::EditRaw(buf, LINENOISE_MAX_LINE, prompt);
		if (count == -1) {
			return NULL;
		}
		return strdup(buf);
	}
}

/* This is just a wrapper the user may want to call in order to make sure
 * the linenoise returned buffer is freed with the same allocator it was
 * created with. Useful when the main program is using an alternative
 * allocator. */
void linenoiseFree(void *ptr) {
	free(ptr);
}

/* ================================ History ================================= */

/* This is the API call to add a new entry in the linenoise history.
 * It uses a fixed array of char pointers that are shifted (memmoved)
 * when the history max length is reached in order to remove the older
 * entry and make room for the new one, so it is not exactly suitable for huge
 * histories, but will work well for a few hundred entries.
 *
 * Using a circular buffer is smarter, but a bit more complex to handle. */
int linenoiseHistoryAdd(const char *line) {
	return History::Add(line);
}

/* Set the maximum length for the history. This function can be called even
 * if there is already some history, the function will make sure to retain
 * just the latest 'len' elements if the new history length value is smaller
 * than the amount of items already inside the history. */
int linenoiseHistorySetMaxLen(int len) {
	if (len < 0) {
		return 0;
	}
	return History::SetMaxLength(idx_t(len));
}

/* Save the history in the specified file. On success 0 is returned,
 * otherwise -1 is returned. */
int linenoiseHistorySave(const char *filename) {
	return History::Save(filename);
}

/* Load the history from the specified file. If the file does not exist
 * zero is returned and no operation is performed.
 *
 * If the file exists and the operation succeeded 0 is returned, otherwise
 * on error -1 is returned. */
int linenoiseHistoryLoad(const char *filename) {
	return History::Load(filename);
}

/* Register a callback function to be called for tab-completion. */
void linenoiseSetCompletionCallback(linenoiseCompletionCallback *fn) {
	Linenoise::SetCompletionCallback(fn);
}

/* Register a hints function to be called to show hints to the user at the
 * right of the prompt. */
void linenoiseSetHintsCallback(linenoiseHintsCallback *fn) {
	Linenoise::SetHintsCallback(fn);
}

/* Register a function to free the hints returned by the hints callback
 * registered with linenoiseSetHintsCallback(). */
void linenoiseSetFreeHintsCallback(linenoiseFreeHintsCallback *fn) {
	Linenoise::SetFreeHintsCallback(fn);
}

void linenoiseSetMultiLine(int ml) {
	Terminal::SetMultiLine(ml);
}

void linenoiseSetPrompt(const char *continuation, const char *continuationSelected) {
	Linenoise::SetPrompt(continuation, continuationSelected);
}

/* This function is used by the callback function registered by the user
 * in order to add completion options given the input string when the
 * user typed <tab>. See the example.c source code for a very easy to
 * understand example. */
void linenoiseAddCompletion(linenoiseCompletions *lc, const char *str) {
	size_t len = strlen(str);
	char *copy, **cvec;

	copy = (char *)malloc(len + 1);
	if (copy == NULL) {
		return;
	}
	memcpy(copy, str, len + 1);
	cvec = (char **)realloc(lc->cvec, sizeof(char *) * (lc->len + 1));
	if (cvec == NULL) {
		free(copy);
		return;
	}
	lc->cvec = cvec;
	lc->cvec[lc->len++] = copy;
}

size_t linenoiseComputeRenderWidth(const char *buf, size_t len) {
	return Linenoise::ComputeRenderWidth(buf, len);
}

int linenoiseGetRenderPosition(const char *buf, size_t len, int max_width, int *n) {
	return Linenoise::GetRenderPosition(buf, len, max_width, n);
}

void linenoiseClearScreen(void) {
	Terminal::ClearScreen();
}

int linenoiseParseOption(const char **azArg, int nArg, const char **out_error) {
	return Linenoise::ParseOption(azArg, nArg, out_error);
}
1622
external/duckdb/tools/shell/linenoise/linenoise.cpp
vendored
Normal file
File diff suppressed because it is too large
975
external/duckdb/tools/shell/linenoise/rendering.cpp
vendored
Normal file
@@ -0,0 +1,975 @@
#include "linenoise.hpp"
#include "highlighting.hpp"
#include "history.hpp"
#include "utf8proc_wrapper.hpp"
#include <unistd.h>

namespace duckdb {
static const char *continuationPrompt = "> ";
static const char *continuationSelectedPrompt = "> ";
static bool enableCompletionRendering = false;
static bool enableErrorRendering = true;

void Linenoise::EnableCompletionRendering() {
	enableCompletionRendering = true;
}

void Linenoise::DisableCompletionRendering() {
	enableCompletionRendering = false;
}

void Linenoise::EnableErrorRendering() {
	enableErrorRendering = true;
}

void Linenoise::DisableErrorRendering() {
	enableErrorRendering = false;
}

/* =========================== Line editing ================================= */

/* We define a very simple "append buffer" structure: a heap-allocated
 * string that we can append to. This is useful in order to write all the
 * escape sequences in a buffer and flush them to the standard output in a
 * single call, to avoid flickering effects. */
struct AppendBuffer {
	void Append(const char *s, idx_t len) {
		buffer.append(s, len);
	}
	void Append(const char *s) {
		buffer.append(s);
	}

	void Write(int fd) {
		if (write(fd, buffer.c_str(), buffer.size()) == -1) {
			/* Can't recover from write error. */
			Linenoise::Log("%s", "Failed to write buffer\n");
		}
	}

private:
	std::string buffer;
};

void Linenoise::SetPrompt(const char *continuation, const char *continuationSelected) {
	continuationPrompt = continuation;
	continuationSelectedPrompt = continuationSelected;
}

/* Helper of refreshSingleLine() and refreshMultiLine() to show hints
 * to the right of the prompt. */
void Linenoise::RefreshShowHints(AppendBuffer &append_buffer, int plen) const {
	char seq[64];
	auto hints_callback = Linenoise::HintsCallback();
	if (hints_callback && plen + len < size_t(ws.ws_col)) {
		int color = -1, bold = 0;
		char *hint = hints_callback(buf, &color, &bold);
		if (hint) {
			int hintlen = strlen(hint);
			int hintmaxlen = ws.ws_col - (plen + len);
			if (hintlen > hintmaxlen) {
				hintlen = hintmaxlen;
			}
			if (bold == 1 && color == -1) {
				color = 37;
			}
			if (color != -1 || bold != 0) {
				snprintf(seq, 64, "\033[%d;%d;49m", bold, color);
			} else {
				seq[0] = '\0';
			}
			append_buffer.Append(seq, strlen(seq));
			append_buffer.Append(hint, hintlen);
			if (color != -1 || bold != 0) {
				append_buffer.Append("\033[0m");
			}
			/* Call the function to free the hint returned. */
			auto free_hints_callback = Linenoise::FreeHintsCallback();
			if (free_hints_callback) {
				free_hints_callback(hint);
			}
		}
	}
}

static void renderText(size_t &render_pos, char *&buf, size_t &len, size_t pos, size_t cols, size_t plen,
                       std::string &highlight_buffer, bool highlight, searchMatch *match = nullptr) {
	if (duckdb::Utf8Proc::IsValid(buf, len)) {
		// utf8 in prompt, handle rendering
		size_t remaining_render_width = cols - plen - 1;
		size_t start_pos = 0;
		size_t cpos = 0;
		size_t prev_pos = 0;
		size_t total_render_width = 0;
		while (cpos < len) {
			size_t char_render_width = duckdb::Utf8Proc::RenderWidth(buf, len, cpos);
			prev_pos = cpos;
			cpos = duckdb::Utf8Proc::NextGraphemeCluster(buf, len, cpos);
			total_render_width += cpos - prev_pos;
			if (total_render_width >= remaining_render_width) {
				// character does not fit anymore! we need to figure something out
				if (prev_pos >= pos) {
					// we passed the cursor: break
					cpos = prev_pos;
					break;
				} else {
					// we did not pass the cursor yet! remove characters from the start until it fits again
					while (total_render_width >= remaining_render_width) {
						size_t start_char_width = duckdb::Utf8Proc::RenderWidth(buf, len, start_pos);
						size_t new_start = duckdb::Utf8Proc::NextGraphemeCluster(buf, len, start_pos);
						total_render_width -= new_start - start_pos;
						start_pos = new_start;
						render_pos -= start_char_width;
					}
				}
			}
			if (prev_pos < pos) {
				render_pos += char_render_width;
			}
		}
		if (highlight) {
			bool is_dot_command = buf[0] == '.';

			auto tokens = Highlighting::Tokenize(buf, len, is_dot_command, match);
			highlight_buffer = Highlighting::HighlightText(buf, len, start_pos, cpos, tokens);
			buf = (char *)highlight_buffer.c_str();
			len = highlight_buffer.size();
		} else {
			buf = buf + start_pos;
			len = cpos - start_pos;
		}
	} else {
		// invalid UTF8: fallback
		while ((plen + pos) >= cols) {
			buf++;
			len--;
			pos--;
		}
		while (plen + len > cols) {
			len--;
		}
		render_pos = pos;
	}
}

/* Single line low level line refresh.
 *
 * Rewrite the currently edited line according to the buffer content,
 * cursor position, and number of columns of the terminal. */
void Linenoise::RefreshSingleLine() const {
	char seq[64];
	size_t plen = GetPromptWidth();
	int fd = ofd;
	char *render_buf = buf;
	size_t render_len = len;
	size_t render_pos = 0;
	std::string highlight_buffer;

	renderText(render_pos, render_buf, render_len, pos, ws.ws_col, plen, highlight_buffer, Highlighting::IsEnabled());

	AppendBuffer append_buffer;
	/* Cursor to left edge */
	append_buffer.Append("\r");
	/* Write the prompt and the current buffer content */
	append_buffer.Append(prompt);
	append_buffer.Append(render_buf, render_len);
	/* Show hints if any. */
	RefreshShowHints(append_buffer, plen);
	/* Erase to right */
	append_buffer.Append("\x1b[0K");
	/* Move cursor to original position. */
	snprintf(seq, 64, "\r\x1b[%dC", (int)(render_pos + plen));
	append_buffer.Append(seq);
	append_buffer.Write(fd);
}

void Linenoise::RefreshSearch() {
	std::string search_prompt;
	static const size_t SEARCH_PROMPT_RENDER_SIZE = 28;
	std::string no_matches_text = "(no matches)";
	bool no_matches = search_index >= search_matches.size();
	if (search_buf.empty()) {
		search_prompt = "search" + std::string(SEARCH_PROMPT_RENDER_SIZE - 8, ' ') + "> ";
		no_matches_text = "(type to search)";
	} else {
		std::string search_text;
		std::string matches_text;
		search_text += search_buf;
		if (!no_matches) {
			matches_text += std::to_string(search_index + 1);
			matches_text += "/" + std::to_string(search_matches.size());
		}
		size_t search_text_length = ComputeRenderWidth(search_text.c_str(), search_text.size());
		size_t matches_text_length = ComputeRenderWidth(matches_text.c_str(), matches_text.size());
		size_t total_text_length = search_text_length + matches_text_length;
		if (total_text_length < SEARCH_PROMPT_RENDER_SIZE - 2) {
			// search text is short: we can render the entire search text
			search_prompt = search_text;
			search_prompt += std::string(SEARCH_PROMPT_RENDER_SIZE - 2 - total_text_length, ' ');
			search_prompt += matches_text;
			search_prompt += "> ";
		} else {
			// search text is too long to fit: truncate
			bool render_matches = matches_text_length < SEARCH_PROMPT_RENDER_SIZE - 8;
			char *search_buf = (char *)search_text.c_str();
			size_t search_len = search_text.size();
			size_t search_render_pos = 0;
			size_t max_render_size = SEARCH_PROMPT_RENDER_SIZE - 3;
			if (render_matches) {
				max_render_size -= matches_text_length;
			}
			std::string highlight_buffer;
			renderText(search_render_pos, search_buf, search_len, search_len, max_render_size, 0, highlight_buffer,
			           false);
			search_prompt = std::string(search_buf, search_len);
			for (size_t i = search_render_pos; i < max_render_size; i++) {
				search_prompt += " ";
			}
			if (render_matches) {
				search_prompt += matches_text;
			}
			search_prompt += "> ";
		}
	}
	auto oldHighlighting = Highlighting::IsEnabled();
	Linenoise clone = *this;
	prompt = search_prompt.c_str();
	plen = search_prompt.size();
	if (no_matches || search_buf.empty()) {
		// if there are no matches render the no_matches_text
		buf = (char *)no_matches_text.c_str();
		len = no_matches_text.size();
		pos = 0;
		// don't highlight the "no_matches" text
		Highlighting::Disable();
	} else {
		// if there are matches render the current history item
		auto search_match = search_matches[search_index];
		auto history_index = search_match.history_index;
		auto cursor_position = search_match.match_end;
		buf = (char *)History::GetEntry(history_index);
		len = strlen(buf);
		pos = cursor_position;
	}
	RefreshLine();

	if (oldHighlighting) {
		Highlighting::Enable();
	}
	buf = clone.buf;
	len = clone.len;
	pos = clone.pos;
	prompt = clone.prompt;
	plen = clone.plen;
}

string Linenoise::AddContinuationMarkers(const char *buf, size_t len, int plen, int cursor_row,
                                         vector<highlightToken> &tokens) const {
	std::string result;
	int rows = 1;
	int cols = plen;
	size_t cpos = 0;
	size_t prev_pos = 0;
	size_t extra_bytes = 0;    // extra bytes introduced
	size_t token_position = 0; // token position
	vector<highlightToken> new_tokens;
	new_tokens.reserve(tokens.size());
	while (cpos < len) {
		bool is_newline = IsNewline(buf[cpos]);
		NextPosition(buf, len, cpos, rows, cols, plen);
		for (; prev_pos < cpos; prev_pos++) {
			result += buf[prev_pos];
		}
		if (is_newline) {
			bool is_cursor_row = rows == cursor_row;
			const char *prompt = is_cursor_row ? continuationSelectedPrompt : continuationPrompt;
			if (!continuation_markers) {
				prompt = "";
			}
			size_t continuationLen = strlen(prompt);
			size_t continuationRender = ComputeRenderWidth(prompt, continuationLen);
			// pad with spaces prior to the prompt
			for (int i = int(continuationRender); i < plen; i++) {
				result += " ";
			}
			result += prompt;
			size_t continuationBytes = plen - continuationRender + continuationLen;
			if (token_position < tokens.size()) {
				for (; token_position < tokens.size(); token_position++) {
					if (tokens[token_position].start >= cpos) {
						// not there yet
						break;
					}
					tokens[token_position].start += extra_bytes;
					new_tokens.push_back(tokens[token_position]);
				}
				tokenType prev_type = tokenType::TOKEN_IDENTIFIER;
				if (token_position > 0 && token_position < tokens.size() + 1) {
					prev_type = tokens[token_position - 1].type;
				}
				highlightToken token;
				token.start = cpos + extra_bytes;
				token.type = is_cursor_row ? tokenType::TOKEN_CONTINUATION_SELECTED : tokenType::TOKEN_CONTINUATION;
				token.search_match = false;
				new_tokens.push_back(token);

				token.start = cpos + extra_bytes + continuationBytes;
				token.type = prev_type;
				token.search_match = false;
				new_tokens.push_back(token);
			}
			extra_bytes += continuationBytes;
		}
	}
	for (; prev_pos < cpos; prev_pos++) {
		result += buf[prev_pos];
	}
	for (; token_position < tokens.size(); token_position++) {
		tokens[token_position].start += extra_bytes;
		new_tokens.push_back(tokens[token_position]);
	}
	tokens = std::move(new_tokens);
	return result;
}

// insert a token of length 1 of the specified type
static void InsertToken(tokenType insert_type, idx_t insert_pos, vector<highlightToken> &tokens) {
	vector<highlightToken> new_tokens;
	new_tokens.reserve(tokens.size() + 1);
	idx_t i;
	bool found = false;
	for (i = 0; i < tokens.size(); i++) {
		// find the exact position where we need to insert the token
		if (tokens[i].start == insert_pos) {
			// this token is exactly at this render position

			// insert highlighting for the bracket
			highlightToken token;
			token.start = insert_pos;
			token.type = insert_type;
			token.search_match = false;
			new_tokens.push_back(token);

			// now we need to insert the other token ONLY if the other token is not immediately following this one
			if (i + 1 >= tokens.size() || tokens[i + 1].start > insert_pos + 1) {
				token.start = insert_pos + 1;
				token.type = tokens[i].type;
				token.search_match = false;
				new_tokens.push_back(token);
			}
			i++;
			found = true;
			break;
		} else if (tokens[i].start > insert_pos) {
			// the next token is AFTER the render position
			// insert highlighting for the bracket
			highlightToken token;
			token.start = insert_pos;
			token.type = insert_type;
			token.search_match = false;
			new_tokens.push_back(token);

			// now just insert the next token
			new_tokens.push_back(tokens[i]);
			i++;
			found = true;
			break;
		} else {
			// insert the token
			new_tokens.push_back(tokens[i]);
		}
	}
	// copy over the remaining tokens
	for (; i < tokens.size(); i++) {
		new_tokens.push_back(tokens[i]);
	}
	if (!found) {
		// token was not added - add it to the end
		highlightToken token;
		token.start = insert_pos;
		token.type = insert_type;
		token.search_match = false;
		new_tokens.push_back(token);
	}
	tokens = std::move(new_tokens);
}

enum class ScanState { STANDARD, IN_SINGLE_QUOTE, IN_DOUBLE_QUOTE, IN_COMMENT, DOLLAR_QUOTED_STRING };

static void OpenBracket(vector<idx_t> &brackets, vector<idx_t> &cursor_brackets, idx_t pos, idx_t i) {
	// check if the cursor is at this position
	if (pos == i) {
		// cursor is exactly on this position - always highlight this bracket
		if (!cursor_brackets.empty()) {
			cursor_brackets.clear();
		}
		cursor_brackets.push_back(i);
	}
	if (cursor_brackets.empty() && ((i + 1) == pos || (pos + 1) == i)) {
		// cursor is either BEFORE or AFTER this bracket and we don't have any highlighted bracket yet
		// highlight this bracket
		cursor_brackets.push_back(i);
	}
	brackets.push_back(i);
}

static void CloseBracket(vector<idx_t> &brackets, vector<idx_t> &cursor_brackets, idx_t pos, idx_t i,
                         vector<idx_t> &errors) {
	if (pos == i) {
		// cursor is on this closing bracket
		// clear any selected brackets - we always select this one
		cursor_brackets.clear();
	}
	if (brackets.empty()) {
		// closing bracket without matching opening bracket
		errors.push_back(i);
	} else {
		if (cursor_brackets.size() == 1) {
			if (cursor_brackets.back() == brackets.back()) {
				// this closing bracket matches the highlighted opening cursor bracket - highlight both
				cursor_brackets.push_back(i);
			}
		} else if (cursor_brackets.empty() && (pos == i || (i + 1) == pos || (pos + 1) == i)) {
			// no cursor bracket selected yet and cursor is BEFORE or AFTER this bracket
			// add this bracket
			cursor_brackets.push_back(i);
			cursor_brackets.push_back(brackets.back());
		}
		brackets.pop_back();
	}
}

static void HandleBracketErrors(const vector<idx_t> &brackets, vector<idx_t> &errors) {
	if (brackets.empty()) {
		return;
	}
	// any brackets that remain on the stack were never closed
	for (auto &bracket : brackets) {
		errors.push_back(bracket);
	}
}

void Linenoise::AddErrorHighlighting(idx_t render_start, idx_t render_end, vector<highlightToken> &tokens) const {
	static constexpr const idx_t MAX_ERROR_LENGTH = 2000;
	if (!enableErrorRendering) {
		return;
	}
	if (len >= MAX_ERROR_LENGTH) {
		return;
	}
	// do a pass over the buffer to collect errors:
	// * brackets without matching closing/opening bracket
	// * single quotes without matching closing single quote
	// * double quotes without matching closing double quote
	ScanState state = ScanState::STANDARD;
	vector<idx_t> brackets;        // ()
	vector<idx_t> square_brackets; // []
	vector<idx_t> curly_brackets;  // {}
	vector<idx_t> errors;
	vector<idx_t> cursor_brackets;
	vector<idx_t> comment_start;
	vector<idx_t> comment_end;
	string dollar_quote_marker;
	idx_t quote_pos = 0;
	for (idx_t i = 0; i < len; i++) {
		auto c = buf[i];
		switch (state) {
		case ScanState::STANDARD:
			switch (c) {
			case '-':
				if (i + 1 < len && buf[i + 1] == '-') {
					// -- puts us in a comment
					comment_start.push_back(i);
					i++;
					state = ScanState::IN_COMMENT;
					break;
				}
				break;
			case '\'':
				state = ScanState::IN_SINGLE_QUOTE;
				quote_pos = i;
				break;
			case '\"':
				state = ScanState::IN_DOUBLE_QUOTE;
				quote_pos = i;
				break;
			case '(':
				OpenBracket(brackets, cursor_brackets, pos, i);
				break;
			case '[':
				OpenBracket(square_brackets, cursor_brackets, pos, i);
				break;
			case '{':
				OpenBracket(curly_brackets, cursor_brackets, pos, i);
				break;
			case ')':
				CloseBracket(brackets, cursor_brackets, pos, i, errors);
				break;
			case ']':
				CloseBracket(square_brackets, cursor_brackets, pos, i, errors);
				break;
			case '}':
				CloseBracket(curly_brackets, cursor_brackets, pos, i, errors);
				break;
			case '$': { // dollar symbol
				if (i + 1 >= len) {
					// we need more than just a dollar
					break;
				}
				// check if this is a dollar-quoted string
				idx_t next_dollar = 0;
				for (idx_t idx = i + 1; idx < len; idx++) {
					if (buf[idx] == '$') {
						// found the next dollar
						next_dollar = idx;
						break;
					}
					// all characters must be between A-Z, a-z or \200 - \377
					if (buf[idx] >= 'A' && buf[idx] <= 'Z') {
						continue;
					}
					if (buf[idx] >= 'a' && buf[idx] <= 'z') {
						continue;
					}
					if (buf[idx] >= '\200' && buf[idx] <= '\377') {
						continue;
					}
					// the first character CANNOT be a numeric, only subsequent characters
					if (idx > i + 1 && buf[idx] >= '0' && buf[idx] <= '9') {
						continue;
					}
					// not a dollar-quoted string
					break;
				}
				if (next_dollar == 0) {
					// not a dollar-quoted string
					break;
				}
				// dollar-quoted string
				state = ScanState::DOLLAR_QUOTED_STRING;
				quote_pos = i;
				i = next_dollar;
				if (i < len) {
					// found a complete marker - store it
					idx_t marker_start = quote_pos + 1;
					dollar_quote_marker = string(buf + marker_start, i - marker_start);
				}
				break;
			}
			default:
				break;
			}
			break;
		case ScanState::IN_COMMENT:
			// comment state - the only thing that will get us out is a newline
			switch (c) {
			case '\r':
			case '\n':
				// newline - exit the comment state
				state = ScanState::STANDARD;
				comment_end.push_back(i);
				break;
			default:
				break;
			}
			break;
		case ScanState::IN_SINGLE_QUOTE:
			// single quote - all that will get us out is an unescaped single-quote
			if (c == '\'') {
				if (i + 1 < len && buf[i + 1] == '\'') {
					// double single-quote means the quote is escaped - continue
					i++;
					break;
				} else {
					// otherwise revert to standard scan state
					state = ScanState::STANDARD;
					break;
				}
			}
			break;
		case ScanState::IN_DOUBLE_QUOTE:
			// double quote - all that will get us out is an unescaped quote
			if (c == '"') {
				if (i + 1 < len && buf[i + 1] == '"') {
					// doubled quote means the quote is escaped - continue
					i++;
					break;
				} else {
					// otherwise revert to standard scan state
					state = ScanState::STANDARD;
					break;
				}
			}
			break;
		case ScanState::DOLLAR_QUOTED_STRING: {
			// dollar-quoted string - all that will get us out is a $[marker]$
			if (c != '$') {
				break;
			}
			if (i + 1 >= len) {
				// no room for the final dollar
				break;
			}
			// skip to the next dollar symbol
			idx_t start = i + 1;
			idx_t end = start;
			while (end < len && buf[end] != '$') {
				end++;
			}
			if (end >= len) {
				// no final dollar found - continue as normal
				break;
			}
			if (end - start != dollar_quote_marker.size()) {
				// length mismatch - cannot match
				break;
			}
			if (memcmp(buf + start, dollar_quote_marker.c_str(), dollar_quote_marker.size()) != 0) {
				// marker mismatch
				break;
			}
			// marker found! revert to standard state
			dollar_quote_marker = string();
			state = ScanState::STANDARD;
			i = end;
			break;
		}
		default:
			break;
		}
	}
	if (state == ScanState::IN_DOUBLE_QUOTE || state == ScanState::IN_SINGLE_QUOTE ||
	    state == ScanState::DOLLAR_QUOTED_STRING) {
		// quote is never closed
		errors.push_back(quote_pos);
	}
	HandleBracketErrors(brackets, errors);
	HandleBracketErrors(square_brackets, errors);
	HandleBracketErrors(curly_brackets, errors);

	// insert all the errors for highlighting
	for (auto &error : errors) {
		Linenoise::Log("Error found at position %llu\n", error);
		if (error < render_start || error > render_end) {
			continue;
		}
		auto render_error = error - render_start;
		InsertToken(tokenType::TOKEN_ERROR, render_error, tokens);
	}
	if (cursor_brackets.size() != 2) {
		// no matching cursor brackets found
		cursor_brackets.clear();
	}
	// insert brackets for highlighting
	for (auto &bracket_position : cursor_brackets) {
		Linenoise::Log("Highlight bracket at position %d\n", bracket_position);
		if (bracket_position < render_start || bracket_position > render_end) {
			continue;
		}

		idx_t render_position = bracket_position - render_start;
		InsertToken(tokenType::TOKEN_BRACKET, render_position, tokens);
	}
	// insert comments
	if (!comment_start.empty()) {
		vector<highlightToken> new_tokens;
		new_tokens.reserve(tokens.size());
		idx_t token_idx = 0;
		for (idx_t c = 0; c < comment_start.size(); c++) {
			auto c_start = comment_start[c];
			auto c_end = c < comment_end.size() ? comment_end[c] : len;
			if (c_start < render_start || c_end > render_end) {
				continue;
			}
			Linenoise::Log("Comment at position %d to %d\n", c_start, c_end);
			c_start -= render_start;
			c_end -= render_start;
			bool inserted_comment = false;

			highlightToken comment_token;
			comment_token.start = c_start;
			comment_token.type = tokenType::TOKEN_COMMENT;
			comment_token.search_match = false;

			for (; token_idx < tokens.size(); token_idx++) {
				if (tokens[token_idx].start >= c_start) {
					// insert the comment here
					new_tokens.push_back(comment_token);
					inserted_comment = true;
					break;
				}
				new_tokens.push_back(tokens[token_idx]);
			}
			if (!inserted_comment) {
				new_tokens.push_back(comment_token);
			} else {
				// skip all tokens until we exit the comment again
				for (; token_idx < tokens.size(); token_idx++) {
					if (tokens[token_idx].start > c_end) {
						break;
					}
				}
			}
		}
		for (; token_idx < tokens.size(); token_idx++) {
			new_tokens.push_back(tokens[token_idx]);
		}
		tokens = std::move(new_tokens);
	}
}
|
||||
static bool IsCompletionCharacter(char c) {
|
||||
if (c >= 'A' && c <= 'Z') {
|
||||
return true;
|
||||
}
|
||||
if (c >= 'a' && c <= 'z') {
|
||||
return true;
|
||||
}
|
||||
if (c == '_') {
|
||||
return true;
|
||||
}
|
||||
return false;
|
||||
}
|
||||
|
||||
bool Linenoise::AddCompletionMarker(const char *buf, idx_t len, string &result_buffer,
                                    vector<highlightToken> &tokens) const {
	if (!enableCompletionRendering) {
		return false;
	}
	if (!continuation_markers) {
		// don't render when pressing ctrl+c, only when editing
		return false;
	}
	static constexpr const idx_t MAX_COMPLETION_LENGTH = 1000;
	if (len >= MAX_COMPLETION_LENGTH) {
		return false;
	}
	if (!insert || pos != len) {
		// only show when inserting a character at the end
		return false;
	}
	if (pos < 3) {
		// we need at least 3 bytes
		return false;
	}
	if (!tokens.empty() && tokens.back().type == tokenType::TOKEN_ERROR) {
		// don't show auto-completion when we have errors
		return false;
	}
	// we ONLY show completion if we have typed at least three characters that are supported for completion
	// for now this is ONLY the characters a-z, A-Z and underscore (_)
	for (idx_t i = pos - 3; i < pos; i++) {
		if (!IsCompletionCharacter(buf[i])) {
			return false;
		}
	}
	auto completion = TabComplete();
	if (completion.completions.empty()) {
		// no completions found
		return false;
	}
	if (completion.completions[0].completion.size() <= len) {
		// completion is not long enough
		return false;
	}
	// we have stricter requirements for rendering completions - the completion must match exactly
	for (idx_t i = pos; i > 0; i--) {
		auto cpos = i - 1;
		if (!IsCompletionCharacter(buf[cpos])) {
			break;
		}
		if (completion.completions[0].completion[cpos] != buf[cpos]) {
			return false;
		}
	}
	// add the first completion found for rendering purposes
	result_buffer = string(buf, len);
	result_buffer += completion.completions[0].completion.substr(len);

	highlightToken completion_token;
	completion_token.start = len;
	completion_token.type = tokenType::TOKEN_COMMENT;
	completion_token.search_match = true;
	tokens.push_back(completion_token);
	return true;
}

/* Multi line low level line refresh.
 *
 * Rewrite the currently edited line according to the buffer content,
 * cursor position, and number of columns of the terminal. */
void Linenoise::RefreshMultiLine() {
	if (!render) {
		return;
	}
	char seq[64];
	int plen = GetPromptWidth();
	// utf8 in prompt, get render width
	int rows, cols;
	int new_cursor_row, new_cursor_x;
	PositionToColAndRow(pos, new_cursor_row, new_cursor_x, rows, cols);
	int col; /* column position, zero-based. */
	int old_rows = maxrows ? maxrows : 1;
	int fd = ofd;
	std::string highlight_buffer;
	auto render_buf = this->buf;
	auto render_len = this->len;
	idx_t render_start = 0;
	idx_t render_end = render_len;
	if (clear_screen) {
		old_cursor_rows = 0;
		old_rows = 0;
		clear_screen = false;
	}
	if (rows > ws.ws_row) {
		// the text does not fit in the terminal (too many rows)
		// enable scrolling mode
		// check if, given the current y_scroll, the cursor is visible
		// display range is [y_scroll, y_scroll + ws.ws_row]
		if (new_cursor_row < int(y_scroll) + 1) {
			y_scroll = new_cursor_row - 1;
		} else if (new_cursor_row > int(y_scroll) + int(ws.ws_row)) {
			y_scroll = new_cursor_row - ws.ws_row;
		}
		// display only characters up to the current scroll position
		if (y_scroll == 0) {
			render_start = 0;
		} else {
			render_start = ColAndRowToPosition(y_scroll, 0);
		}
		if (int(y_scroll) + int(ws.ws_row) >= rows) {
			render_end = len;
		} else {
			render_end = ColAndRowToPosition(y_scroll + ws.ws_row, 99999);
		}
		new_cursor_row -= y_scroll;
		render_buf += render_start;
		render_len = render_end - render_start;
		Linenoise::Log("truncate to rows %d - %d (render bytes %d to %d)", y_scroll, y_scroll + ws.ws_row,
		               render_start, render_end);
		rows = ws.ws_row;
	} else {
		y_scroll = 0;
	}

	/* Update maxrows if needed. */
	if (rows > (int)maxrows) {
		maxrows = rows;
	}

	vector<highlightToken> tokens;
	if (Highlighting::IsEnabled()) {
		bool is_dot_command = buf[0] == '.';
		auto match = search_index < search_matches.size() ? &search_matches[search_index] : nullptr;
		tokens = Highlighting::Tokenize(render_buf, render_len, is_dot_command, match);

		// add error highlighting
		AddErrorHighlighting(render_start, render_end, tokens);

		// add completion hint
		if (AddCompletionMarker(render_buf, render_len, highlight_buffer, tokens)) {
			render_buf = (char *)highlight_buffer.c_str();
			render_len = highlight_buffer.size();
		}
	}
	if (rows > 1) {
		// add continuation markers
		highlight_buffer = AddContinuationMarkers(render_buf, render_len, plen,
		                                          y_scroll > 0 ? new_cursor_row + 1 : new_cursor_row, tokens);
		render_buf = (char *)highlight_buffer.c_str();
		render_len = highlight_buffer.size();
	}
	if (duckdb::Utf8Proc::IsValid(render_buf, render_len)) {
		if (Highlighting::IsEnabled()) {
			highlight_buffer = Highlighting::HighlightText(render_buf, render_len, 0, render_len, tokens);
			render_buf = (char *)highlight_buffer.c_str();
			render_len = highlight_buffer.size();
		}
	}

	/* First step: clear all the lines used before. To do so start by
	 * going to the last row. */
	AppendBuffer append_buffer;
	if (old_rows - old_cursor_rows > 0) {
		Linenoise::Log("go down %d", old_rows - old_cursor_rows);
		snprintf(seq, 64, "\x1b[%dB", old_rows - int(old_cursor_rows));
		append_buffer.Append(seq);
	}

	/* Now for every row clear it, go up. */
	for (int j = 0; j < old_rows - 1; j++) {
		Linenoise::Log("clear+up");
		append_buffer.Append("\r\x1b[0K\x1b[1A");
	}

	/* Clean the top line. */
	Linenoise::Log("clear");
	append_buffer.Append("\r\x1b[0K");

	/* Write the prompt and the current buffer content */
	if (y_scroll == 0) {
		append_buffer.Append(prompt);
	}
	append_buffer.Append(render_buf, render_len);

	/* Show hints if any. */
	RefreshShowHints(append_buffer, plen);

	/* If we are at the very end of the screen with our prompt, we need to
	 * emit a newline and move the prompt to the first column. */
	Linenoise::Log("pos > 0 %d", pos > 0 ? 1 : 0);
	Linenoise::Log("pos == len %d", pos == len ? 1 : 0);
	Linenoise::Log("new_cursor_x == cols %d", new_cursor_x == ws.ws_col ? 1 : 0);
	if (pos > 0 && pos == len && new_cursor_x == ws.ws_col) {
		Linenoise::Log("<newline>", 0);
		append_buffer.Append("\n");
		append_buffer.Append("\r");
		rows++;
		new_cursor_row++;
		new_cursor_x = 0;
		if (rows > (int)maxrows) {
			maxrows = rows;
		}
	}
	Linenoise::Log("render %d rows (old rows %d)", rows, old_rows);

	/* Move cursor to right position. */
	Linenoise::Log("new_cursor_row %d", new_cursor_row);
	Linenoise::Log("new_cursor_x %d", new_cursor_x);
	Linenoise::Log("len %d", len);
	Linenoise::Log("old_cursor_rows %d", old_cursor_rows);
	Linenoise::Log("pos %d", pos);
	Linenoise::Log("max cols %d", ws.ws_col);

	/* Go up till we reach the expected position. */
	if (rows - new_cursor_row > 0) {
		Linenoise::Log("go-up %d", rows - new_cursor_row);
		snprintf(seq, 64, "\x1b[%dA", rows - new_cursor_row);
		append_buffer.Append(seq);
	}

	/* Set column. */
	col = new_cursor_x;
	Linenoise::Log("set col %d", 1 + col);
	if (col) {
		snprintf(seq, 64, "\r\x1b[%dC", col);
	} else {
		snprintf(seq, 64, "\r");
	}
	append_buffer.Append(seq);

	Linenoise::Log("\n");
	old_cursor_rows = new_cursor_row;
	append_buffer.Write(fd);
}

/* Calls the two low level functions refreshSingleLine() or
 * refreshMultiLine() according to the selected mode. */
void Linenoise::RefreshLine() {
	if (Terminal::IsMultiline()) {
		RefreshMultiLine();
	} else {
		RefreshSingleLine();
	}
}

} // namespace duckdb
496
external/duckdb/tools/shell/linenoise/terminal.cpp
vendored
Normal file
@@ -0,0 +1,496 @@
#include "terminal.hpp"
#include "history.hpp"
#include "linenoise.hpp"
#include <termios.h>
#include <unistd.h>
#include <stdlib.h>
#include <stdio.h>
#include <errno.h>
#include <string.h>
#include <ctype.h>
#include <sys/ioctl.h>
#include <sys/select.h>
#include <sys/time.h>

namespace duckdb {

static int mlmode = 1;              /* Multi line mode. Default is multi line. */
static struct termios orig_termios; /* In order to restore at exit.*/
static int atexit_registered = 0;   /* Register atexit just 1 time. */
static int rawmode = 0;             /* For atexit() function to check if restore is needed*/
static const char *unsupported_term[] = {"dumb", "cons25", "emacs", NULL};

/* At exit we'll try to fix the terminal to the initial conditions. */
static void linenoiseAtExit(void) {
	Terminal::DisableRawMode();
	History::Free();
}

/* Return true if the terminal name is in the list of terminals we know are
 * not able to understand basic escape sequences. */
int Terminal::IsUnsupportedTerm() {
	char *term = getenv("TERM");
	int j;

	if (!term) {
		return 0;
	}
	for (j = 0; unsupported_term[j]; j++) {
		if (!strcasecmp(term, unsupported_term[j])) {
			return 1;
		}
	}
	return 0;
}

/* Raw mode: 1960 magic shit. */
int Terminal::EnableRawMode() {
	int fd = STDIN_FILENO;

	if (!isatty(STDIN_FILENO)) {
		errno = ENOTTY;
		return -1;
	}
	if (!atexit_registered) {
		atexit(linenoiseAtExit);
		atexit_registered = 1;
	}
	if (tcgetattr(fd, &orig_termios) == -1) {
		errno = ENOTTY;
		return -1;
	}

	auto raw = orig_termios; /* modify the original mode */
	/* input modes: no break, no CR to NL, no parity check, no strip char,
	 * no start/stop output control. */
	raw.c_iflag &= ~(BRKINT | ICRNL | INPCK | ISTRIP | IXON);
	/* output modes - disable post processing */
	raw.c_oflag &= ~(OPOST);
#ifdef IUTF8
	/* input modes - enable UTF-8 input handling where supported */
	raw.c_iflag |= IUTF8;
#endif
	/* control modes - set 8 bit chars */
	raw.c_cflag |= CS8;
	/* local modes - echoing off, canonical off, no extended functions,
	 * no signal chars (^Z,^C) */
	raw.c_lflag &= ~(ECHO | ICANON | IEXTEN | ISIG);
	/* control chars - set return condition: min number of bytes and timer.
	 * We want read to return every single byte, without timeout. */
	raw.c_cc[VMIN] = 1;
	raw.c_cc[VTIME] = 0; /* 1 byte, no timer */

	/* put terminal in raw mode after flushing */
	if (tcsetattr(fd, TCSADRAIN, &raw) < 0) {
		errno = ENOTTY;
		return -1;
	}
	rawmode = 1;
	return 0;
}

void Terminal::DisableRawMode() {
	int fd = STDIN_FILENO;
	/* Don't even check the return value as it's too late. */
	if (rawmode && tcsetattr(fd, TCSADRAIN, &orig_termios) != -1) {
		rawmode = 0;
	}
}

bool Terminal::IsMultiline() {
	return mlmode;
}

bool Terminal::IsAtty() {
	return isatty(STDIN_FILENO);
}

/* This function is called when linenoise() is called with the standard
 * input file descriptor not attached to a TTY. So for example when the
 * program using linenoise is called in pipe or with a file redirected
 * to its standard input. In this case, we want to be able to return the
 * line regardless of its length (by default we are limited to 4k). */
char *Terminal::EditNoTTY() {
	char *line = NULL;
	size_t len = 0, maxlen = 0;

	while (1) {
		if (len == maxlen) {
			if (maxlen == 0)
				maxlen = 16;
			maxlen *= 2;
			char *oldval = line;
			line = (char *)realloc(line, maxlen);
			if (line == NULL) {
				if (oldval)
					free(oldval);
				return NULL;
			}
		}
		int c = fgetc(stdin);
		if (c == EOF || c == '\n') {
			if (c == EOF && len == 0) {
				free(line);
				return NULL;
			} else {
				line[len] = '\0';
				return line;
			}
		} else {
			line[len] = c;
			len++;
		}
	}
}

/* This function calls the line editing function linenoiseEdit() using
 * the STDIN file descriptor set in raw mode. */
int Terminal::EditRaw(char *buf, size_t buflen, const char *prompt) {
	int count;

	if (buflen == 0) {
		errno = EINVAL;
		return -1;
	}

	if (Terminal::EnableRawMode() == -1) {
		return -1;
	}
	Linenoise l(STDIN_FILENO, STDOUT_FILENO, buf, buflen, prompt);
	count = l.Edit();
	Terminal::DisableRawMode();
	printf("\n");
	return count;
}

// returns true if there is more data available to read in a particular stream
int Terminal::HasMoreData(int fd) {
	fd_set rfds;
	FD_ZERO(&rfds);
	FD_SET(fd, &rfds);

	// no timeout: return immediately
	struct timeval tv;
	tv.tv_sec = 0;
	tv.tv_usec = 0;
	return select(fd + 1, &rfds, NULL, NULL, &tv);
}

/* ======================= Low level terminal handling ====================== */

/* Set whether or not to use multi line mode. */
void Terminal::SetMultiLine(int ml) {
	mlmode = ml;
}

static int parseInt(const char *s, int *offset = nullptr) {
	int result = 0;
	int idx;
	for (idx = 0; s[idx]; idx++) {
		char c = s[idx];
		if (c < '0' || c > '9') {
			break;
		}
		result = result * 10 + c - '0';
		if (result > 1000000) {
			result = 1000000;
		}
	}
	if (offset) {
		*offset = idx;
	}
	return result;
}

static int tryParseEnv(const char *env_var) {
	char *s;
	s = getenv(env_var);
	if (!s) {
		return 0;
	}
	return parseInt(s);
}

/* Use the ESC [6n escape sequence to query the cursor position
 * and return it. On error a zeroed TerminalSize is returned, on success
 * the position of the cursor. */
TerminalSize Terminal::GetCursorPosition() {
	int ifd = STDIN_FILENO;
	int ofd = STDOUT_FILENO;
	TerminalSize ws;

	char buf[32];
	unsigned int i = 0;

	/* Report cursor location */
	if (write(ofd, "\x1b[6n", 4) != 4) {
		return ws;
	}

	/* Read the response: ESC [ rows ; cols R */
	while (i < sizeof(buf) - 1) {
		if (read(ifd, buf + i, 1) != 1) {
			break;
		}
		if (buf[i] == 'R') {
			break;
		}
		i++;
	}
	buf[i] = '\0';

	/* Parse it. */
	if (buf[0] != ESC || buf[1] != '[') {
		return ws;
	}
	int offset = 2;
	int new_offset;
	ws.ws_row = parseInt(buf + offset, &new_offset);
	offset += new_offset;
	if (buf[offset] != ';') {
		return ws;
	}
	offset++;
	ws.ws_col = parseInt(buf + offset);
	return ws;
}

TerminalSize Terminal::TryMeasureTerminalSize() {
	int ofd = STDOUT_FILENO;
	/* Fallback for when ioctl() fails: query the terminal itself. */
	TerminalSize start, result;

	/* Get the initial position so we can restore it later. */
	start = GetCursorPosition();
	if (!start.ws_col) {
		return result;
	}

	/* Go to bottom-right margin */
	if (write(ofd, "\x1b[999;999f", 10) != 10) {
		return result;
	}
	result = GetCursorPosition();
	if (!result.ws_col) {
		return result;
	}

	/* Restore position. */
	char seq[32];
	snprintf(seq, 32, "\x1b[%d;%df", start.ws_row, start.ws_col);
	if (write(ofd, seq, strlen(seq)) == -1) {
		/* Can't recover... */
	}
	return result;
}

/* Try to get the number of columns in the current terminal, or assume 80
 * if it fails. */
TerminalSize Terminal::GetTerminalSize() {
	TerminalSize result;

	// try ioctl first; only take the values if the call succeeds
	{
		struct winsize ws;
		if (ioctl(1, TIOCGWINSZ, &ws) != -1) {
			result.ws_col = ws.ws_col;
			result.ws_row = ws.ws_row;
		}
	}
	// try ROWS and COLUMNS env variables
	if (!result.ws_col) {
		result.ws_col = tryParseEnv("COLUMNS");
	}
	if (!result.ws_row) {
		result.ws_row = tryParseEnv("ROWS");
	}
	// if those fail measure the size by moving the cursor to the corner and fetching the position
	if (!result.ws_col || !result.ws_row) {
		TerminalSize measured_size = TryMeasureTerminalSize();
		Linenoise::Log("measured size col %d,row %d -- ", measured_size.ws_row, measured_size.ws_col);
		if (measured_size.ws_row) {
			result.ws_row = measured_size.ws_row;
		}
		if (measured_size.ws_col) {
			result.ws_col = measured_size.ws_col;
		}
	}
	// if all else fails use defaults (80,24)
	if (!result.ws_col) {
		result.ws_col = 80;
	}
	if (!result.ws_row) {
		result.ws_row = 24;
	}
	return result;
}

/* Clear the screen. Used to handle ctrl+l */
void Terminal::ClearScreen() {
	if (write(STDOUT_FILENO, "\x1b[H\x1b[2J", 7) <= 0) {
		/* nothing to do, just to avoid warning. */
	}
}

/* Beep, used for completion when there is nothing to complete or when all
 * the choices were already shown. */
void Terminal::Beep() {
	fprintf(stderr, "\x7");
	fflush(stderr);
}

EscapeSequence Terminal::ReadEscapeSequence(int ifd) {
	char seq[5];
	idx_t length = ReadEscapeSequence(ifd, seq);
	if (length == 0) {
		return EscapeSequence::INVALID;
	}
	Linenoise::Log("escape of length %d\n", length);
	switch (length) {
	case 1:
		if (seq[0] >= 'a' && seq[0] <= 'z') {
			return EscapeSequence(idx_t(EscapeSequence::ALT_A) + (seq[0] - 'a'));
		}
		if (seq[0] >= 'A' && seq[0] <= 'Z') {
			return EscapeSequence(idx_t(EscapeSequence::ALT_A) + (seq[0] - 'A'));
		}
		switch (seq[0]) {
		case BACKSPACE:
			return EscapeSequence::ALT_BACKSPACE;
		case ESC:
			return EscapeSequence::ESCAPE;
		case '<':
			return EscapeSequence::ALT_LEFT_ARROW;
		case '>':
			return EscapeSequence::ALT_RIGHT_ARROW;
		case '\\':
			return EscapeSequence::ALT_BACKSLASH;
		default:
			Linenoise::Log("unrecognized escape sequence of length 1 - %d\n", seq[0]);
			break;
		}
		break;
	case 2:
		if (seq[0] == 'O') {
			switch (seq[1]) {
			case 'A': /* Up */
				return EscapeSequence::UP;
			case 'B': /* Down */
				return EscapeSequence::DOWN;
			case 'C': /* Right */
				return EscapeSequence::RIGHT;
			case 'D': /* Left */
				return EscapeSequence::LEFT;
			case 'H': /* Home */
				return EscapeSequence::HOME;
			case 'F': /* End */
				return EscapeSequence::END;
			case 'c':
				return EscapeSequence::ALT_F;
			case 'd':
				return EscapeSequence::ALT_B;
			default:
				Linenoise::Log("unrecognized escape sequence (O) %d\n", seq[1]);
				break;
			}
		} else if (seq[0] == '[') {
			switch (seq[1]) {
			case 'A': /* Up */
				return EscapeSequence::UP;
			case 'B': /* Down */
				return EscapeSequence::DOWN;
			case 'C': /* Right */
				return EscapeSequence::RIGHT;
			case 'D': /* Left */
				return EscapeSequence::LEFT;
			case 'H': /* Home */
				return EscapeSequence::HOME;
			case 'F': /* End */
				return EscapeSequence::END;
			case 'Z': /* Shift Tab */
				return EscapeSequence::SHIFT_TAB;
			default:
				Linenoise::Log("unrecognized escape sequence (seq[1]) %d\n", seq[1]);
				break;
			}
		} else {
			Linenoise::Log("unrecognized escape sequence of length %d (%d %d)\n", length, seq[0], seq[1]);
		}
		break;
	case 3:
		if (seq[2] == '~') {
			switch (seq[1]) {
			case '1':
				return EscapeSequence::HOME;
			case '3': /* Delete key. */
				return EscapeSequence::DELETE;
			case '4':
			case '8':
				return EscapeSequence::END;
			default:
				Linenoise::Log("unrecognized escape sequence (~) %d\n", seq[1]);
				break;
			}
		} else if (seq[1] == '5' && seq[2] == 'C') {
			return EscapeSequence::ALT_F;
		} else if (seq[1] == '5' && seq[2] == 'D') {
			return EscapeSequence::ALT_B;
		} else {
			Linenoise::Log("unrecognized escape sequence of length %d\n", length);
		}
		break;
	case 5:
		if (memcmp(seq, "[1;5C", 5) == 0 || memcmp(seq, "[1;3C", 5) == 0) {
			// [1;5C: move word right
			return EscapeSequence::CTRL_MOVE_FORWARDS;
		} else if (memcmp(seq, "[1;5D", 5) == 0 || memcmp(seq, "[1;3D", 5) == 0) {
			// [1;5D: move word left
			return EscapeSequence::CTRL_MOVE_BACKWARDS;
		} else {
			Linenoise::Log("unrecognized escape sequence (;) %d\n", seq[1]);
		}
		break;
	default:
		Linenoise::Log("unrecognized escape sequence of length %d\n", length);
		break;
	}
	return EscapeSequence::UNKNOWN;
}

idx_t Terminal::ReadEscapeSequence(int ifd, char seq[]) {
	if (read(ifd, seq, 1) == -1) {
		return 0;
	}
	switch (seq[0]) {
	case 'O':
	case '[':
		// these characters have multiple bytes following them
		break;
	default:
		return 1;
	}
	if (read(ifd, seq + 1, 1) == -1) {
		return 0;
	}

	if (seq[0] != '[') {
		return 2;
	}
	if (seq[1] < '0' || seq[1] > '9') {
		return 2;
	}
	/* Extended escape, read additional byte. */
	if (read(ifd, seq + 2, 1) == -1) {
		return 0;
	}
	if (seq[2] == ';') {
		// read 2 extra bytes
		if (read(ifd, seq + 3, 2) == -1) {
			return 0;
		}
		return 5;
	} else {
		return 3;
	}
}

} // namespace duckdb
40
external/duckdb/tools/shell/rc/duckdb.rc
vendored
Normal file
@@ -0,0 +1,40 @@
#include <windows.h>

#define Q(x) #x
#define QUOTE(x) Q(x)

VS_VERSION_INFO VERSIONINFO
FILEVERSION DUCKDB_MAJOR_VERSION,DUCKDB_MINOR_VERSION,DUCKDB_PATCH_VERSION,DUCKDB_DEV_ITERATION
PRODUCTVERSION DUCKDB_MAJOR_VERSION,DUCKDB_MINOR_VERSION,DUCKDB_PATCH_VERSION,DUCKDB_DEV_ITERATION
#ifdef DEBUG
FILEFLAGSMASK VS_FF_DEBUG | VS_FF_PRERELEASE
#else
FILEFLAGSMASK 0
#endif
FILEOS VOS_NT_WINDOWS32
FILETYPE VFT_APP
BEGIN
	BLOCK "StringFileInfo"
	BEGIN
		BLOCK "040904b0"
		BEGIN
			VALUE "Comments", "DuckDB shell"
			VALUE "CompanyName", "DuckDB Labs"

			VALUE "FileDescription", "DuckDB shell"
			VALUE "FileVersion", QUOTE(DUCKDB_VERSION)
			VALUE "InternalName", "DuckDB shell"
			VALUE "LegalCopyright", "Copyright 2018-" QUOTE(DUCKDB_COPYRIGHT_YEAR) " Stichting DuckDB Foundation"

			VALUE "OriginalFilename", "duckdb.exe"
			VALUE "ProductName", "DuckDB"
			VALUE "ProductVersion", QUOTE(DUCKDB_VERSION)
		END
	END
	BLOCK "VarFileInfo"
	BEGIN
		VALUE "Translation", 0x409, 1252
	END
END

MAINICON ICON "../../../logo/DuckDB.ico"
5357
external/duckdb/tools/shell/shell.cpp
vendored
Normal file
File diff suppressed because it is too large
294
external/duckdb/tools/shell/shell_highlight.cpp
vendored
Normal file
@@ -0,0 +1,294 @@
#include "shell_highlight.hpp"
#include "shell_state.hpp"
#include "duckdb/parser/parser.hpp"

#if defined(_WIN32) || defined(WIN32)
#include <windows.h>
#endif

namespace duckdb_shell {

struct HighlightElement {
	const char *name;
	PrintColor color;
	PrintIntensity intensity;
};

static HighlightElement highlight_elements[] = {{"error", PrintColor::RED, PrintIntensity::BOLD},
                                                {"keyword", PrintColor::GREEN, PrintIntensity::STANDARD},
                                                {"numeric_constant", PrintColor::YELLOW, PrintIntensity::STANDARD},
                                                {"string_constant", PrintColor::YELLOW, PrintIntensity::STANDARD},
                                                {"line_indicator", PrintColor::STANDARD, PrintIntensity::BOLD},
                                                {"column_name", PrintColor::STANDARD, PrintIntensity::STANDARD},
                                                {"column_type", PrintColor::STANDARD, PrintIntensity::STANDARD},
                                                {"numeric_value", PrintColor::STANDARD, PrintIntensity::STANDARD},
                                                {"string_value", PrintColor::STANDARD, PrintIntensity::STANDARD},
                                                {"temporal_value", PrintColor::STANDARD, PrintIntensity::STANDARD},
                                                {"null_value", PrintColor::GRAY, PrintIntensity::STANDARD},
                                                {"footer", PrintColor::STANDARD, PrintIntensity::STANDARD},
                                                {"layout", PrintColor::GRAY, PrintIntensity::STANDARD},
                                                {"none", PrintColor::STANDARD, PrintIntensity::STANDARD},
                                                {nullptr, PrintColor::STANDARD, PrintIntensity::STANDARD}};

struct HighlightColors {
	const char *name;
	PrintColor color;
};

static const HighlightColors highlight_colors[] = {{"standard", PrintColor::STANDARD}, {"red", PrintColor::RED},
                                                   {"yellow", PrintColor::YELLOW},     {"green", PrintColor::GREEN},
                                                   {"gray", PrintColor::GRAY},         {"blue", PrintColor::BLUE},
                                                   {"magenta", PrintColor::MAGENTA},   {"cyan", PrintColor::CYAN},
                                                   {"white", PrintColor::WHITE},       {nullptr, PrintColor::STANDARD}};

ShellHighlight::ShellHighlight(ShellState &state) : state(state) {
}

/*
** Output text to the console in a font that attracts extra attention.
*/
#ifdef _WIN32
void ShellHighlight::PrintText(const string &text, PrintOutput output, PrintColor color, PrintIntensity intensity) {
	HANDLE out = GetStdHandle(output == PrintOutput::STDOUT ? STD_OUTPUT_HANDLE : STD_ERROR_HANDLE);
	CONSOLE_SCREEN_BUFFER_INFO defaultScreenInfo;
	GetConsoleScreenBufferInfo(out, &defaultScreenInfo);
	WORD wAttributes = 0;

	switch (intensity) {
	case PrintIntensity::BOLD:
	case PrintIntensity::BOLD_UNDERLINE:
		wAttributes |= FOREGROUND_INTENSITY;
		break;
	default:
		break;
	}
	switch (color) {
	case PrintColor::RED:
		wAttributes |= FOREGROUND_RED;
		break;
	case PrintColor::GREEN:
		wAttributes |= FOREGROUND_GREEN;
		break;
	case PrintColor::BLUE:
		wAttributes |= FOREGROUND_BLUE;
		break;
	case PrintColor::YELLOW:
		wAttributes |= FOREGROUND_RED | FOREGROUND_GREEN;
		break;
	case PrintColor::GRAY:
		wAttributes |= FOREGROUND_RED | FOREGROUND_GREEN | FOREGROUND_BLUE;
		break;
	case PrintColor::MAGENTA:
		wAttributes |= FOREGROUND_BLUE | FOREGROUND_RED;
		break;
	case PrintColor::CYAN:
		wAttributes |= FOREGROUND_BLUE | FOREGROUND_GREEN;
		break;
	case PrintColor::WHITE:
		wAttributes |= FOREGROUND_BLUE | FOREGROUND_GREEN | FOREGROUND_RED;
		break;
	default:
		break;
	}
	if (wAttributes != 0) {
		SetConsoleTextAttribute(out, wAttributes);
	}

	state.Print(output, text);

	SetConsoleTextAttribute(out, defaultScreenInfo.wAttributes);
}
#else
void ShellHighlight::PrintText(const string &text, PrintOutput output, PrintColor color, PrintIntensity intensity) {
	const char *bold_prefix = "";
	const char *color_prefix = "";
	const char *suffix = "";
	switch (intensity) {
	case PrintIntensity::BOLD:
		bold_prefix = "\033[1m";
		break;
	case PrintIntensity::UNDERLINE:
		bold_prefix = "\033[4m";
		break;
	case PrintIntensity::BOLD_UNDERLINE:
		bold_prefix = "\033[1m\033[4m";
		break;
	default:
		break;
	}
	switch (color) {
	case PrintColor::RED:
		color_prefix = "\033[31m";
		break;
	case PrintColor::GREEN:
		color_prefix = "\033[32m";
		break;
	case PrintColor::YELLOW:
		color_prefix = "\033[33m";
		break;
	case PrintColor::GRAY:
		color_prefix = "\033[90m";
		break;
	case PrintColor::BLUE:
		color_prefix = "\033[34m";
		break;
	case PrintColor::MAGENTA:
		color_prefix = "\033[35m";
		break;
	case PrintColor::CYAN:
		color_prefix = "\033[36m";
		break;
	case PrintColor::WHITE:
		color_prefix = "\033[37m";
		break;
	default:
		break;
	}
	if (*color_prefix || *bold_prefix) {
		suffix = "\033[0m";
	}
	fprintf(output == PrintOutput::STDOUT ? state.out : stderr, "%s%s%s%s", bold_prefix, color_prefix, text.c_str(),
	        suffix);
}
#endif

void ShellHighlight::PrintText(const string &text, PrintOutput output, HighlightElementType type) {
|
||||
auto index = static_cast<uint32_t>(type);
|
||||
auto max_index = static_cast<uint32_t>(HighlightElementType::NONE);
|
||||
if (index > max_index) {
|
||||
index = max_index;
|
||||
}
|
||||
auto highlight_info = highlight_elements[index];
|
||||
PrintText(text, output, highlight_info.color, highlight_info.intensity);
|
||||
}
|
||||
|
||||
void ShellHighlight::PrintError(string error_msg) {
|
||||
if (error_msg.empty()) {
|
||||
return;
|
||||
}
|
||||
vector<duckdb::SimplifiedToken> tokens;
|
||||
string error_type;
|
||||
auto error_location = duckdb::StringUtil::Find(error_msg, "Error: ");
|
||||
if (error_location.IsValid()) {
|
||||
error_type = error_msg.substr(0, error_location.GetIndex() + 6);
|
||||
error_msg = error_msg.substr(error_location.GetIndex() + 7);
|
||||
}
|
||||
try {
|
||||
tokens = duckdb::Parser::TokenizeError(error_msg);
|
||||
} catch (...) {
|
||||
// fallback
|
||||
state.Print(PrintOutput::STDERR, error_msg.c_str());
|
||||
state.Print(PrintOutput::STDERR, "\n");
|
||||
return;
|
||||
}
|
||||
if (!tokens.empty() && tokens[0].start > 0) {
|
||||
duckdb::SimplifiedToken new_token;
|
||||
new_token.type = duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_IDENTIFIER;
|
||||
new_token.start = 0;
|
||||
tokens.insert(tokens.begin(), new_token);
|
||||
}
|
||||
if (tokens.empty() && !error_msg.empty()) {
|
||||
duckdb::SimplifiedToken new_token;
|
||||
new_token.type = duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_IDENTIFIER;
|
||||
new_token.start = 0;
|
||||
tokens.push_back(new_token);
|
||||
}
|
||||
if (!error_type.empty()) {
|
||||
PrintText(error_type + "\n", PrintOutput::STDERR, HighlightElementType::ERROR_TOKEN);
|
||||
}
|
||||
for (idx_t i = 0; i < tokens.size(); i++) {
|
||||
HighlightElementType element_type = HighlightElementType::NONE;
|
||||
switch (tokens[i].type) {
|
||||
case duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_IDENTIFIER:
|
||||
break;
|
||||
case duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_ERROR:
|
||||
element_type = HighlightElementType::ERROR_TOKEN;
|
||||
break;
|
||||
case duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_NUMERIC_CONSTANT:
|
||||
element_type = HighlightElementType::NUMERIC_CONSTANT;
|
||||
break;
|
||||
case duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_STRING_CONSTANT:
|
||||
element_type = HighlightElementType::STRING_CONSTANT;
|
||||
break;
|
||||
case duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_OPERATOR:
|
||||
break;
|
||||
case duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_KEYWORD:
|
||||
element_type = HighlightElementType::KEYWORD;
|
||||
break;
|
||||
case duckdb::SimplifiedTokenType::SIMPLIFIED_TOKEN_COMMENT:
|
||||
element_type = HighlightElementType::LINE_INDICATOR;
|
||||
break;
|
||||
}
|
||||
idx_t start = tokens[i].start;
|
||||
idx_t end = i + 1 == tokens.size() ? error_msg.size() : tokens[i + 1].start;
|
||||
if (end - start > 0) {
|
||||
string error_print = error_msg.substr(tokens[i].start, end - start);
|
||||
PrintText(error_print, PrintOutput::STDERR, element_type);
|
||||
}
|
||||
}
|
||||
PrintText("\n", PrintOutput::STDERR, PrintColor::STANDARD, PrintIntensity::STANDARD);
|
||||
}
|
||||
|
||||
bool ShellHighlight::SetColor(const char *element_type, const char *color, const char *intensity) {
|
||||
idx_t i;
|
||||
for (i = 0; highlight_elements[i].name; i++) {
|
||||
if (duckdb::StringUtil::CIEquals(element_type, highlight_elements[i].name)) {
|
||||
break;
|
||||
}
|
||||
}
|
||||
if (!highlight_elements[i].name) {
|
||||
// element not found
|
||||
string supported_options;
|
||||
for (i = 0; highlight_elements[i].name; i++) {
|
||||
if (!supported_options.empty()) {
|
||||
supported_options += ", ";
|
||||
}
|
||||
supported_options += highlight_elements[i].name;
|
||||
}
|
||||
state.Print(PrintOutput::STDERR, duckdb::StringUtil::Format("Unknown element '%s', supported options: %s\n",
|
||||
element_type, supported_options.c_str()));
|
||||
return false;
|
||||
}
|
||||
|
||||
// found the element - parse the color
|
||||
idx_t c;
|
||||
for (c = 0; highlight_colors[c].name; c++) {
|
||||
if (duckdb::StringUtil::CIEquals(color, highlight_colors[c].name)) {
|
||||
break;
|
||||
}
|
||||
}
|
||||
if (!highlight_colors[c].name) {
|
||||
// color not found
|
||||
string supported_options;
|
||||
for (c = 0; highlight_colors[c].name; c++) {
|
||||
if (!supported_options.empty()) {
|
||||
supported_options += ", ";
|
||||
}
|
||||
supported_options += highlight_colors[c].name;
|
||||
}
|
||||
state.Print(PrintOutput::STDERR, duckdb::StringUtil::Format("Unknown color '%s', supported options: %s\n",
|
||||
color, supported_options.c_str()));
|
||||
return false;
|
||||
}
|
||||
highlight_elements[i].color = highlight_colors[c].color;
|
||||
highlight_elements[i].intensity = PrintIntensity::STANDARD;
|
||||
if (intensity) {
|
||||
if (duckdb::StringUtil::CIEquals(intensity, "standard")) {
|
||||
highlight_elements[i].intensity = PrintIntensity::STANDARD;
|
||||
} else if (duckdb::StringUtil::CIEquals(intensity, "bold")) {
|
||||
highlight_elements[i].intensity = PrintIntensity::BOLD;
|
||||
} else if (duckdb::StringUtil::CIEquals(intensity, "underline")) {
|
||||
highlight_elements[i].intensity = PrintIntensity::UNDERLINE;
|
||||
} else if (duckdb::StringUtil::CIEquals(intensity, "bold_underline")) {
|
||||
highlight_elements[i].intensity = PrintIntensity::BOLD_UNDERLINE;
|
||||
} else {
|
||||
state.Print(PrintOutput::STDERR,
|
||||
duckdb::StringUtil::Format(
|
||||
"Unknown intensity '%s', supported options: standard, bold, underline\n", intensity));
|
||||
return false;
|
||||
}
|
||||
}
|
||||
return true;
|
||||
}
|
||||
|
||||
} // namespace duckdb_shell
|
||||
844 external/duckdb/tools/shell/shell_renderer.cpp vendored Normal file
@@ -0,0 +1,844 @@
#include "shell_renderer.hpp"

#include "shell_state.hpp"
#include "duckdb_shell_wrapper.h"
#include "sqlite3.h"
#include <stdexcept>
#include <cstring>

namespace duckdb_shell {

bool ShellRenderer::IsColumnar(RenderMode mode) {
	switch (mode) {
	case RenderMode::COLUMN:
	case RenderMode::TABLE:
	case RenderMode::BOX:
	case RenderMode::MARKDOWN:
	case RenderMode::LATEX:
		return true;
	default:
		return false;
	}
}

ShellRenderer::ShellRenderer(ShellState &state)
    : state(state), show_header(state.showHeader), col_sep(state.colSeparator), row_sep(state.rowSeparator) {
}

//===--------------------------------------------------------------------===//
// Column Renderers
//===--------------------------------------------------------------------===//
ColumnRenderer::ColumnRenderer(ShellState &state) : ShellRenderer(state) {
}

void ColumnRenderer::RenderFooter(ColumnarResult &result) {
}

void ColumnRenderer::RenderAlignedValue(ColumnarResult &result, idx_t i) {
	idx_t w = result.column_width[i];
	idx_t n = state.RenderLength(result.data[i]);
	state.PrintPadded("", (w - n) / 2);
	state.Print(result.data[i]);
	state.PrintPadded("", (w - n + 1) / 2);
}

class ModeColumnRenderer : public ColumnRenderer {
public:
	explicit ModeColumnRenderer(ShellState &state) : ColumnRenderer(state) {
	}

	void RenderHeader(ColumnarResult &result) override {
		if (!show_header) {
			return;
		}
		for (idx_t i = 0; i < result.column_count; i++) {
			state.UTF8WidthPrint(state.out, result.column_width[i], result.data[i], result.right_align[i]);
			state.Print(i == result.column_count - 1 ? "\n" : " ");
		}
		for (idx_t i = 0; i < result.column_count; i++) {
			state.PrintDashes(result.column_width[i]);
			state.Print(i == result.column_count - 1 ? "\n" : " ");
		}
	}

	const char *GetColumnSeparator() override {
		return " ";
	}
	const char *GetRowSeparator() override {
		return "\n";
	}
};

class ModeTableRenderer : public ColumnRenderer {
public:
	explicit ModeTableRenderer(ShellState &state) : ColumnRenderer(state) {
	}

	void RenderHeader(ColumnarResult &result) override {
		state.PrintRowSeparator(result.column_count, "+", result.column_width);
		state.Print("| ");
		for (idx_t i = 0; i < result.column_count; i++) {
			RenderAlignedValue(result, i);
			state.Print(i == result.column_count - 1 ? " |\n" : " | ");
		}
		state.PrintRowSeparator(result.column_count, "+", result.column_width);
	}

	void RenderFooter(ColumnarResult &result) override {
		state.PrintRowSeparator(result.column_count, "+", result.column_width);
	}

	const char *GetColumnSeparator() override {
		return " | ";
	}
	const char *GetRowSeparator() override {
		return " |\n";
	}
	const char *GetRowStart() override {
		return "| ";
	}
};

class ModeMarkdownRenderer : public ColumnRenderer {
public:
	explicit ModeMarkdownRenderer(ShellState &state) : ColumnRenderer(state) {
	}

	void RenderHeader(ColumnarResult &result) override {
		state.Print(GetRowStart());
		for (idx_t i = 0; i < result.column_count; i++) {
			if (i > 0) {
				state.Print(GetColumnSeparator());
			}
			RenderAlignedValue(result, i);
		}
		state.Print(GetRowSeparator());
		state.PrintMarkdownSeparator(result.column_count, "|", result.types, result.column_width);
	}

	const char *GetColumnSeparator() override {
		return " | ";
	}
	const char *GetRowSeparator() override {
		return " |\n";
	}
	const char *GetRowStart() override {
		return "| ";
	}
};

/*
** UTF8 box-drawing characters.  Imagine box lines like this:
**
**           1
**           |
**       4 --+-- 2
**           |
**           3
**
** Each box characters has between 2 and 4 of the lines leading from
** the center.  The characters are here identified by the numbers of
** their corresponding lines.
*/
#define BOX_24   "\342\224\200" /* U+2500 --- */
#define BOX_13   "\342\224\202" /* U+2502  |  */
#define BOX_23   "\342\224\214" /* U+250c  ,- */
#define BOX_34   "\342\224\220" /* U+2510 -,  */
#define BOX_12   "\342\224\224" /* U+2514  '- */
#define BOX_14   "\342\224\230" /* U+2518 -'  */
#define BOX_123  "\342\224\234" /* U+251c  |- */
#define BOX_134  "\342\224\244" /* U+2524 -|  */
#define BOX_234  "\342\224\254" /* U+252c -,- */
#define BOX_124  "\342\224\264" /* U+2534 -'- */
#define BOX_1234 "\342\224\274" /* U+253c -|- */

class ModeBoxRenderer : public ColumnRenderer {
public:
	explicit ModeBoxRenderer(ShellState &state) : ColumnRenderer(state) {
	}

	void RenderHeader(ColumnarResult &result) override {
		print_box_row_separator(result.column_count, BOX_23, BOX_234, BOX_34, result.column_width);
		state.Print(BOX_13 " ");
		for (idx_t i = 0; i < result.column_count; i++) {
			RenderAlignedValue(result, i);
			state.Print(i == result.column_count - 1 ? " " BOX_13 "\n" : " " BOX_13 " ");
		}
		print_box_row_separator(result.column_count, BOX_123, BOX_1234, BOX_134, result.column_width);
	}

	void RenderFooter(ColumnarResult &result) override {
		print_box_row_separator(result.column_count, BOX_12, BOX_124, BOX_14, result.column_width);
	}

	const char *GetColumnSeparator() override {
		return " " BOX_13 " ";
	}
	const char *GetRowSeparator() override {
		return " " BOX_13 "\n";
	}
	const char *GetRowStart() override {
		return BOX_13 " ";
	}

private:
	/* Draw horizontal line N characters long using unicode box
	** characters
	*/
	void print_box_line(idx_t N) {
		string box_line;
		for (idx_t i = 0; i < N; i++) {
			box_line += BOX_24;
		}
		state.Print(box_line);
	}

	/*
	** Draw a horizontal separator for a RenderMode::Box table.
	*/
	void print_box_row_separator(int nArg, const char *zSep1, const char *zSep2, const char *zSep3,
	                             const vector<idx_t> &actualWidth) {
		int i;
		if (nArg > 0) {
			state.Print(zSep1);
			print_box_line(actualWidth[0] + 2);
			for (i = 1; i < nArg; i++) {
				state.Print(zSep2);
				print_box_line(actualWidth[i] + 2);
			}
			state.Print(zSep3);
		}
		state.Print("\n");
	}
};

class ModeLatexRenderer : public ColumnRenderer {
public:
	explicit ModeLatexRenderer(ShellState &state) : ColumnRenderer(state) {
	}

	void RenderHeader(ColumnarResult &result) override {
		state.Print("\\begin{tabular}{|");
		for (idx_t i = 0; i < result.column_count; i++) {
			if (state.ColumnTypeIsInteger(result.type_names[i])) {
				state.Print("r");
			} else {
				state.Print("l");
			}
		}
		state.Print("|}\n");
		state.Print("\\hline\n");
		for (idx_t i = 0; i < result.column_count; i++) {
			RenderAlignedValue(result, i);
			state.Print(i == result.column_count - 1 ? GetRowSeparator() : GetColumnSeparator());
		}
		state.Print("\\hline\n");
	}

	void RenderFooter(ColumnarResult &) override {
		state.Print("\\hline\n");
		state.Print("\\end{tabular}\n");
	}

	const char *GetColumnSeparator() override {
		return " & ";
	}
	const char *GetRowSeparator() override {
		return " \\\\\n";
	}
};

unique_ptr<ColumnRenderer> ShellState::GetColumnRenderer() {
	switch (cMode) {
	case RenderMode::COLUMN:
		return unique_ptr<ColumnRenderer>(new ModeColumnRenderer(*this));
	case RenderMode::TABLE:
		return unique_ptr<ColumnRenderer>(new ModeTableRenderer(*this));
	case RenderMode::MARKDOWN:
		return unique_ptr<ColumnRenderer>(new ModeMarkdownRenderer(*this));
	case RenderMode::BOX:
		return unique_ptr<ColumnRenderer>(new ModeBoxRenderer(*this));
	case RenderMode::LATEX:
		return unique_ptr<ColumnRenderer>(new ModeLatexRenderer(*this));
	default:
		throw std::runtime_error("Unsupported mode for GetColumnRenderer");
	}
}

//===--------------------------------------------------------------------===//
// Row Renderers
//===--------------------------------------------------------------------===//
RowRenderer::RowRenderer(ShellState &state) : ShellRenderer(state) {
}

void RowRenderer::Render(RowResult &result) {
	if (first_row) {
		RenderHeader(result);
		first_row = false;
	}
	RenderRow(result);
}

void RowRenderer::RenderHeader(RowResult &result) {
}

void RowRenderer::RenderFooter(RowResult &result) {
}

class ModeLineRenderer : public RowRenderer {
public:
	explicit ModeLineRenderer(ShellState &state) : RowRenderer(state) {
	}

	void Render(RowResult &result) override {
		if (first_row) {
			auto &col_names = result.column_names;
			// determine the render width by going over the column names
			header_width = 5;
			for (idx_t i = 0; i < col_names.size(); i++) {
				auto len = ShellState::StringLength(col_names[i] ? col_names[i] : "");
				if (len > header_width) {
					header_width = len;
				}
			}
			first_row = false;
		} else {
			state.Print(state.rowSeparator);
		}
		// render the row
		RenderRow(result);
	}

	void RenderRow(RowResult &result) override {
		auto &data = result.data;
		auto &col_names = result.column_names;
		for (idx_t i = 0; i < data.size(); i++) {
			state.PrintPadded(col_names[i], header_width);
			state.Print(" = ");
			state.PrintValue(data[i]);
			state.Print(state.rowSeparator);
		}
	}

	idx_t header_width = 0;
};

class ModeExplainRenderer : public RowRenderer {
public:
	explicit ModeExplainRenderer(ShellState &state) : RowRenderer(state) {
	}

	void RenderRow(RowResult &result) override {
		auto &data = result.data;
		if (data.size() != 2) {
			return;
		}
		if (strcmp(data[0], "logical_plan") == 0 || strcmp(data[0], "logical_opt") == 0 ||
		    strcmp(data[0], "physical_plan") == 0) {
			state.Print("\n┌─────────────────────────────┐\n");
			state.Print("│┌───────────────────────────┐│\n");
			if (strcmp(data[0], "logical_plan") == 0) {
				state.Print("││ Unoptimized Logical Plan  ││\n");
			} else if (strcmp(data[0], "logical_opt") == 0) {
				state.Print("││  Optimized Logical Plan   ││\n");
			} else if (strcmp(data[0], "physical_plan") == 0) {
				state.Print("││       Physical Plan       ││\n");
			}
			state.Print("│└───────────────────────────┘│\n");
			state.Print("└─────────────────────────────┘\n");
		}
		state.Print(data[1]);
	}
};

class ModeListRenderer : public RowRenderer {
public:
	explicit ModeListRenderer(ShellState &state) : RowRenderer(state) {
	}

	void RenderHeader(RowResult &result) override {
		if (!show_header) {
			return;
		}
		auto &col_names = result.column_names;
		for (idx_t i = 0; i < col_names.size(); i++) {
			if (i > 0) {
				state.Print(col_sep);
			}
			state.Print(col_names[i]);
		}
		state.Print(row_sep);
	}

	void RenderRow(RowResult &result) override {
		auto &data = result.data;
		for (idx_t i = 0; i < data.size(); i++) {
			if (i > 0) {
				state.Print(col_sep);
			}
			state.PrintValue(data[i]);
		}
		state.Print(row_sep);
	}
};

class ModeHtmlRenderer : public RowRenderer {
public:
	explicit ModeHtmlRenderer(ShellState &state) : RowRenderer(state) {
	}

	void RenderHeader(RowResult &result) override {
		if (!show_header) {
			return;
		}
		auto &col_names = result.column_names;
		state.Print("<tr>");
		for (idx_t i = 0; i < col_names.size(); i++) {
			state.Print("<th>");
			output_html_string(col_names[i]);
			state.Print("</th>\n");
		}
		state.Print("</tr>\n");
	}

	void RenderRow(RowResult &result) override {
		auto &data = result.data;
		state.Print("<tr>");
		for (idx_t i = 0; i < data.size(); i++) {
			state.Print("<td>");
			output_html_string(data[i] ? data[i] : state.nullValue.c_str());
			state.Print("</td>\n");
		}
		state.Print("</tr>\n");
	}

	/*
	** Output the given string with characters that are special to
	** HTML escaped.
	*/
	void output_html_string(const char *z) {
		if (z == 0)
			z = "";
		string escaped;
		for (; *z; z++) {
			switch (*z) {
			case '<':
				escaped += "&lt;";
				break;
			case '&':
				escaped += "&amp;";
				break;
			case '>':
				escaped += "&gt;";
				break;
			case '\"':
				escaped += "&quot;";
				break;
			case '\'':
				escaped += "&#39;";
				break;
			default:
				escaped += *z;
			}
		}
		state.Print(escaped);
	}
};

class ModeTclRenderer : public RowRenderer {
public:
	explicit ModeTclRenderer(ShellState &state) : RowRenderer(state) {
	}

	void RenderHeader(RowResult &result) override {
		if (!show_header) {
			return;
		}
		auto &col_names = result.column_names;
		for (idx_t i = 0; i < col_names.size(); i++) {
			if (i > 0) {
				state.Print(col_sep);
			}
			state.OutputCString(col_names[i] ? col_names[i] : "");
		}
		state.Print(row_sep);
	}

	void RenderRow(RowResult &result) override {
		auto &data = result.data;
		for (idx_t i = 0; i < data.size(); i++) {
			if (i > 0) {
				state.Print(col_sep);
			}
			state.OutputCString(data[i] ? data[i] : state.nullValue.c_str());
		}
		state.Print(row_sep);
	}
};

class ModeCsvRenderer : public RowRenderer {
public:
	explicit ModeCsvRenderer(ShellState &state) : RowRenderer(state) {
	}

	void Render(RowResult &result) override {
		state.SetBinaryMode();
		RowRenderer::Render(result);
		state.SetTextMode();
	}
	void RenderHeader(RowResult &result) override {
		if (!show_header) {
			return;
		}
		auto &col_names = result.column_names;
		for (idx_t i = 0; i < col_names.size(); i++) {
			state.OutputCSV(col_names[i] ? col_names[i] : "", i < col_names.size() - 1);
		}
		state.Print(row_sep);
	}

	void RenderRow(RowResult &result) override {
		auto &data = result.data;
		for (idx_t i = 0; i < data.size(); i++) {
			state.OutputCSV(data[i], i < data.size() - 1);
		}
		state.Print(row_sep);
	}
};

class ModeAsciiRenderer : public RowRenderer {
public:
	explicit ModeAsciiRenderer(ShellState &state) : RowRenderer(state) {
		col_sep = "\n";
		row_sep = "\n";
	}

	void RenderHeader(RowResult &result) override {
		if (!show_header) {
			return;
		}
		auto &col_names = result.column_names;
		for (idx_t i = 0; i < col_names.size(); i++) {
			if (i > 0) {
				state.Print(col_sep);
			}
			state.Print(col_names[i] ? col_names[i] : "");
		}
		state.Print(row_sep);
	}

	void RenderRow(RowResult &result) override {
		auto &data = result.data;
		for (idx_t i = 0; i < data.size(); i++) {
			if (i > 0) {
				state.Print(col_sep);
			}
			state.PrintValue(data[i]);
		}
		state.Print(row_sep);
	}
};

class ModeQuoteRenderer : public RowRenderer {
public:
	explicit ModeQuoteRenderer(ShellState &state) : RowRenderer(state) {
	}

	void RenderHeader(RowResult &result) override {
		if (!show_header) {
			return;
		}
		auto &col_names = result.column_names;
		for (idx_t i = 0; i < col_names.size(); i++) {
			if (i > 0) {
				state.Print(col_sep);
			}
			state.OutputQuotedString(col_names[i]);
		}
		state.Print(row_sep);
	}

	void RenderRow(RowResult &result) override {
		auto &data = result.data;
		auto &types = result.types;
		for (idx_t i = 0; i < data.size(); i++) {
			if (i > 0)
				state.Print(col_sep);
			if ((data[i] == 0) || (!types.empty() && types[i] == SQLITE_NULL)) {
				state.Print("NULL");
			} else if (!types.empty() && (types[i] == SQLITE_TEXT || types[i] == SQLITE_BLOB)) {
				state.OutputQuotedString(data[i]);
			} else if (!types.empty() && (types[i] == SQLITE_INTEGER || types[i] == SQLITE_FLOAT)) {
				state.Print(data[i]);
			} else if (state.IsNumber(data[i], 0)) {
				state.Print(data[i]);
			} else {
				state.OutputQuotedString(data[i]);
			}
		}
		state.Print(row_sep);
	}
};

class ModeJsonRenderer : public RowRenderer {
public:
	explicit ModeJsonRenderer(ShellState &state, bool json_array) : RowRenderer(state), json_array(json_array) {
	}

	void Render(RowResult &result) override {
		if (first_row) {
			if (json_array) {
				// wrap all JSON objects in an array
				state.Print("[");
			}
			state.Print("{");
			first_row = false;
		} else {
			if (json_array) {
				// wrap all JSON objects in an array
				state.Print(",");
			}
			state.Print("\n{");
		}
		RenderRow(result);
	}

	void RenderRow(RowResult &result) override {
		auto &data = result.data;
		auto &types = result.types;
		auto &col_names = result.column_names;
		for (idx_t i = 0; i < col_names.size(); i++) {
			if (i > 0) {
				state.Print(",");
			}
			state.OutputJSONString(col_names[i], -1);
			state.Print(":");
			if ((data[i] == 0) || (!types.empty() && types[i] == SQLITE_NULL)) {
				state.Print("null");
			} else if (!types.empty() && types[i] == SQLITE_FLOAT) {
				if (strcmp(data[i], "inf") == 0) {
					state.Print("1e999");
				} else if (strcmp(data[i], "-inf") == 0) {
					state.Print("-1e999");
				} else if (strcmp(data[i], "nan") == 0) {
					state.Print("null");
				} else if (strcmp(data[i], "-nan") == 0) {
					state.Print("null");
				} else {
					state.Print(data[i]);
				}
			} else if (!types.empty() && types[i] == SQLITE_BLOB && result.pStmt) {
				const void *pBlob = sqlite3_column_blob(result.pStmt, i);
				int nBlob = sqlite3_column_bytes(result.pStmt, i);
				state.OutputJSONString((const char *)pBlob, nBlob);
			} else if (!types.empty() && types[i] == SQLITE_TEXT) {
				state.OutputJSONString(data[i], -1);
			} else {
				state.Print(data[i]);
			}
		}
		state.Print("}");
	}

	void RenderFooter(RowResult &result) override {
		if (json_array) {
			state.Print("]\n");
		} else {
			state.Print("\n");
		}
	}

	bool json_array;
};

class ModeInsertRenderer : public RowRenderer {
public:
	explicit ModeInsertRenderer(ShellState &state) : RowRenderer(state) {
	}

	void RenderRow(RowResult &result) override {
		auto &data = result.data;
		auto &types = result.types;
		auto &col_names = result.column_names;

		state.Print("INSERT INTO ");
		state.Print(state.zDestTable);
		if (show_header) {
			state.Print("(");
			for (idx_t i = 0; i < col_names.size(); i++) {
				if (i > 0) {
					state.Print(",");
				}
				state.PrintOptionallyQuotedIdentifier(col_names[i]);
			}
			state.Print(")");
		}
		for (idx_t i = 0; i < data.size(); i++) {
			state.Print(i > 0 ? "," : " VALUES(");
			if ((data[i] == 0) || (!types.empty() && types[i] == SQLITE_NULL)) {
				state.Print("NULL");
			} else if (state.IsNumber(data[i], nullptr)) {
				state.Print(data[i]);
			} else if (state.ShellHasFlag(SHFLG_Newlines)) {
				state.OutputQuotedString(data[i]);
			} else {
				state.OutputQuotedEscapedString(data[i]);
			}
		}
		state.Print(");\n");
	}
};

class ModeSemiRenderer : public RowRenderer {
public:
	explicit ModeSemiRenderer(ShellState &state) : RowRenderer(state) {
	}

	void RenderRow(RowResult &result) override {
		/* .schema and .fullschema output */
		state.PrintSchemaLine(result.data[0], "\n");
	}
};

class ModePrettyRenderer : public RowRenderer {
public:
	explicit ModePrettyRenderer(ShellState &state) : RowRenderer(state) {
	}

	void RenderRow(RowResult &result) override {
		auto &data = result.data;
		/* .schema and .fullschema with --indent */
		if (data.size() != 1) {
			throw std::runtime_error("row must have exactly one value for pretty rendering");
		}
		char *z;
		int j;
		int nParen = 0;
		char cEnd = 0;
		char c;
		int nLine = 0;
		if (!data[0]) {
			return;
		}
		if (sqlite3_strlike("CREATE VIEW%", data[0], 0) == 0 || sqlite3_strlike("CREATE TRIG%", data[0], 0) == 0) {
			state.Print(data[0]);
			state.Print(";\n");
			return;
		}
		z = sqlite3_mprintf("%s", data[0]);
		j = 0;
		idx_t i;
		for (i = 0; IsSpace(z[i]); i++) {
		}
		for (; (c = z[i]) != 0; i++) {
			if (IsSpace(c)) {
				if (z[j - 1] == '\r')
					z[j - 1] = '\n';
				if (IsSpace(z[j - 1]) || z[j - 1] == '(')
					continue;
			} else if ((c == '(' || c == ')') && j > 0 && IsSpace(z[j - 1])) {
				j--;
			}
			z[j++] = c;
		}
		while (j > 0 && IsSpace(z[j - 1])) {
			j--;
		}
		z[j] = 0;
		if (state.StringLength(z) >= 79) {
			for (i = j = 0; (c = z[i]) != 0; i++) { /* Copy from z[i] back to z[j] */
				if (c == cEnd) {
					cEnd = 0;
				} else if (c == '"' || c == '\'' || c == '`') {
					cEnd = c;
				} else if (c == '[') {
					cEnd = ']';
				} else if (c == '-' && z[i + 1] == '-') {
					cEnd = '\n';
				} else if (c == '(') {
					nParen++;
				} else if (c == ')') {
					nParen--;
					if (nLine > 0 && nParen == 0 && j > 0) {
						state.PrintSchemaLineN(z, j, "\n");
						j = 0;
					}
				}
				z[j++] = c;
				if (nParen == 1 && cEnd == 0 && (c == '(' || c == '\n' || (c == ',' && !wsToEol(z + i + 1)))) {
					if (c == '\n')
						j--;
					state.PrintSchemaLineN(z, j, "\n ");
					j = 0;
					nLine++;
					while (IsSpace(z[i + 1])) {
						i++;
					}
				}
			}
			z[j] = 0;
		}
		state.PrintSchemaLine(z, ";\n");
		sqlite3_free(z);
	}

	/*
	** Return true if string z[] has nothing but whitespace and comments to the
	** end of the first line.
	*/
	static bool wsToEol(const char *z) {
		int i;
		for (i = 0; z[i]; i++) {
			if (z[i] == '\n') {
				return true;
			}
			if (IsSpace(z[i])) {
				continue;
			}
			if (z[i] == '-' && z[i + 1] == '-') {
				return true;
			}
			return false;
		}
		return true;
	}
};

unique_ptr<RowRenderer> ShellState::GetRowRenderer() {
	return GetRowRenderer(cMode);
}

unique_ptr<RowRenderer> ShellState::GetRowRenderer(RenderMode mode) {
	switch (mode) {
	case RenderMode::LINE:
		return unique_ptr<RowRenderer>(new ModeLineRenderer(*this));
	case RenderMode::EXPLAIN:
		return unique_ptr<RowRenderer>(new ModeExplainRenderer(*this));
	case RenderMode::LIST:
		return unique_ptr<RowRenderer>(new ModeListRenderer(*this));
	case RenderMode::HTML:
		return unique_ptr<RowRenderer>(new ModeHtmlRenderer(*this));
	case RenderMode::TCL:
		return unique_ptr<RowRenderer>(new ModeTclRenderer(*this));
	case RenderMode::CSV:
		return unique_ptr<RowRenderer>(new ModeCsvRenderer(*this));
	case RenderMode::ASCII:
		return unique_ptr<RowRenderer>(new ModeAsciiRenderer(*this));
	case RenderMode::QUOTE:
		return unique_ptr<RowRenderer>(new ModeQuoteRenderer(*this));
	case RenderMode::JSON:
		return unique_ptr<RowRenderer>(new ModeJsonRenderer(*this, true));
	case RenderMode::JSONLINES:
		return unique_ptr<RowRenderer>(new ModeJsonRenderer(*this, false));
	case RenderMode::INSERT:
		return unique_ptr<RowRenderer>(new ModeInsertRenderer(*this));
	case RenderMode::SEMI:
		return unique_ptr<RowRenderer>(new ModeSemiRenderer(*this));
	case RenderMode::PRETTY:
		return unique_ptr<RowRenderer>(new ModePrettyRenderer(*this));
	default:
		throw std::runtime_error("Unsupported mode for GetRowRenderer");
	}
}

} // namespace duckdb_shell
180
external/duckdb/tools/shell/tests/conftest.py
vendored
Normal file
@@ -0,0 +1,180 @@
import pytest
import os
import subprocess
import sys
from typing import List, NamedTuple, Union


def pytest_addoption(parser):
    parser.addoption(
        "--shell-binary", action="store", default=None, help="Provide the shell binary to use for the tests"
    )
    parser.addoption("--start-offset", action="store", type=int, help="Skip the first 'n' tests")


def pytest_collection_modifyitems(config, items):
    start_offset = config.getoption("--start-offset")
    if not start_offset:
        # --start-offset not given on the command line, therefore move on
        return

    skipped = pytest.mark.skip(reason="skipped because of --start-offset")
    skipped_items = items[:start_offset]
    for item in skipped_items:
        item.add_marker(skipped)


class TestResult:
    def __init__(self, stdout, stderr, status_code):
        self.stdout: Union[str, bytes] = stdout
        self.stderr: Union[str, bytes] = stderr
        self.status_code: int = status_code

    def check_stdout(self, expected: Union[str, List[str], bytes]):
        if isinstance(expected, list):
            expected = '\n'.join(expected)
        assert self.status_code == 0
        assert expected in self.stdout

    def check_not_exist(self, not_exist: Union[str, List[str], bytes]):
        if isinstance(not_exist, list):
            not_exist = '\n'.join(not_exist)
        assert self.status_code == 0
        assert not_exist not in self.stdout

    def check_stderr(self, expected: str):
        assert expected in self.stderr


class ShellTest:
    def __init__(self, shell, arguments=[]):
        if not shell:
            raise ValueError("Please provide a shell binary")
        self.shell = shell
        self.arguments = [shell, '--batch', '--init', '/dev/null'] + arguments
        self.statements: List[str] = []
        self.input = None
        self.output = None
        self.environment = {}

    def add_argument(self, *args):
        self.arguments.extend(args)
        return self

    def statement(self, stmt):
        self.statements.append(stmt)
        return self

    def query(self, *stmts):
        self.statements.extend(stmts)
        return self

    def input_file(self, file_path):
        self.input = file_path
        return self

    def output_file(self, file_path):
        self.output = file_path
        return self

    # Test running methods

    def get_command(self, cmd: str) -> List[str]:
        command = self.arguments
        if self.input:
            command += [cmd]
        return command

    def get_input_data(self, cmd: str):
        if self.input:
            input_data = open(self.input, 'rb').read()
        else:
            input_data = bytearray(cmd, 'utf8')
        return input_data

    def get_output_pipe(self):
        output_pipe = subprocess.PIPE
        if self.output:
            output_pipe = open(self.output, 'w+')
        return output_pipe

    def get_statements(self):
        statements = []
        for statement in self.statements:
            if statement.startswith('.'):
                statements.append(statement)
            else:
                statements.append(statement + ';')
        return '\n'.join(statements)

    def get_output_data(self, res):
        if self.output:
            stdout = open(self.output, 'r').read()
        else:
            stdout = res.stdout.decode('utf8').strip()
        stderr = res.stderr.decode('utf8').strip()
        return stdout, stderr

    def run(self):
        statements = self.get_statements()
        command = self.get_command(statements)
        input_data = self.get_input_data(statements)
        output_pipe = self.get_output_pipe()

        my_env = os.environ.copy()
        for key, val in self.environment.items():
            my_env[key] = val

        res = subprocess.run(command, input=input_data, stdout=output_pipe, stderr=subprocess.PIPE, env=my_env)

        stdout, stderr = self.get_output_data(res)
        return TestResult(stdout, stderr, res.returncode)


@pytest.fixture()
def shell(request):
    custom_arg = request.config.getoption("--shell-binary")
    if not custom_arg:
        raise ValueError("Please provide a shell binary path to the tester, using '--shell-binary <path_to_cli>'")
    return custom_arg


@pytest.fixture()
def random_filepath(request, tmp_path):
    tmp_file = tmp_path / "random_import_file"
    return tmp_file


@pytest.fixture()
def generated_file(request, random_filepath):
    param = request.param
    tmp_file = random_filepath
    with open(tmp_file, 'w+') as f:
        f.write(param)
    return tmp_file


def check_load_status(shell, extension: str):
    binary = ShellTest(shell)
    binary.statement(f"select loaded from duckdb_extensions() where extension_name = '{extension}';")
    result = binary.run()
    return result.stdout


def assert_loaded(shell, extension: str):
    # TODO: add a command line argument to fail instead of skip if the extension is not loaded
    out = check_load_status(shell, extension)
    if 'true' not in out:
        pytest.skip(reason=f"'{extension}' extension is not loaded!")
    return


@pytest.fixture()
def autocomplete_extension(shell):
    assert_loaded(shell, 'autocomplete')


@pytest.fixture()
def json_extension(shell):
    assert_loaded(shell, 'json')
340
external/duckdb/tools/shell/tests/test_autocomplete.py
vendored
Normal file
@@ -0,0 +1,340 @@
# fmt: off

import pytest
import subprocess
import sys
from typing import List
from conftest import ShellTest
from conftest import autocomplete_extension
import os

# 'autocomplete_extension' is a fixture which will skip the test if 'autocomplete' is not loaded
def test_autocomplete_select(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CALL sql_auto_complete('SEL')")
    )
    result = test.run()
    result.check_stdout('SELECT')

def test_autocomplete_first_from(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CALL sql_auto_complete('FRO')")
    )
    result = test.run()
    result.check_stdout('FROM')

def test_autocomplete_column(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE my_table(my_column INTEGER);")
        .statement("SELECT * FROM sql_auto_complete('SELECT my_') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('my_column')

def test_autocomplete_where(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE my_table(my_column INTEGER);")
        .statement("SELECT * FROM sql_auto_complete('SELECT my_column FROM my_table WH') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('WHERE')

def test_autocomplete_insert(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE my_table(my_column INTEGER);")
        .statement("SELECT * FROM sql_auto_complete('INS') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('INSERT')

def test_autocomplete_into(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE my_table(my_column INTEGER);")
        .statement("SELECT * FROM sql_auto_complete('INSERT IN') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('INTO')

def test_autocomplete_into_table(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE my_table(my_column INTEGER);")
        .statement("SELECT * FROM sql_auto_complete('INSERT INTO my_t') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('my_table')

def test_autocomplete_values(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE my_table(my_column INTEGER);")
        .statement("SELECT * FROM sql_auto_complete('INSERT INTO my_table VAL') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('VALUES')

def test_autocomplete_delete(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE my_table(my_column INTEGER);")
        .statement("SELECT * FROM sql_auto_complete('DEL') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('DELETE')

def test_autocomplete_delete_from(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE my_table(my_column INTEGER);")
        .statement("SELECT * FROM sql_auto_complete('DELETE F') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('FROM')

def test_autocomplete_from_table(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE my_table(my_column INTEGER);")
        .statement("SELECT * FROM sql_auto_complete('DELETE FROM m') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('my_table')

def test_autocomplete_update(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE my_table(my_column INTEGER);")
        .statement("SELECT * FROM sql_auto_complete('UP') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('UPDATE')

def test_autocomplete_update_table(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE my_table(my_column INTEGER);")
        .statement("SELECT * FROM sql_auto_complete('UPDATE m') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('my_table')

    test = (
        ShellTest(shell)
        .statement("CREATE TABLE my_table(my_column INTEGER);")
        .statement("""SELECT * FROM sql_auto_complete('UPDATE "m') LIMIT 1;""")
    )
    result = test.run()
    result.check_stdout('my_table')

def test_autocomplete_update_column(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE my_table(my_column INTEGER);")
        .statement("SELECT * FROM sql_auto_complete('UPDATE my_table SET m') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('my_column')

def test_autocomplete_funky_table(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("""CREATE TABLE "Funky Table With Spaces"(my_column INTEGER);""")
        .statement("SELECT * FROM sql_auto_complete('SELECT * FROM F') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('"Funky Table With Spaces"')

    test = (
        ShellTest(shell)
        .statement("""CREATE TABLE "Funky Table With Spaces"("Funky Column" int);""")
        .statement("""SELECT * FROM sql_auto_complete('select "Funky Column" FROM f') LIMIT 1;""")
    )
    result = test.run()
    result.check_stdout('"Funky Table With Spaces"')

def test_autocomplete_funky_column(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("""CREATE TABLE "Funky Table With Spaces"("Funky Column" int);""")
        .statement("SELECT * FROM sql_auto_complete('select f') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('"Funky Column"')

def test_autocomplete_semicolon(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("SELECT * FROM sql_auto_complete('SELECT 42; SEL') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('SELECT')

def test_autocomplete_comments(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("""
SELECT * FROM sql_auto_complete('--SELECT * FROM
SEL') LIMIT 1;""")
    )
    result = test.run()
    result.check_stdout('SELECT')

def test_autocomplete_scalar_functions(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("SELECT * FROM sql_auto_complete('SELECT regexp_m') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('regexp_matches')

def test_autocomplete_aggregates(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("SELECT * FROM sql_auto_complete('SELECT approx_c') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('approx_count_distinct')

def test_autocomplete_builtin_views(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("SELECT * FROM sql_auto_complete('SELECT * FROM sqlite_ma') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('sqlite_master')

def test_autocomplete_table_function(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("SELECT * FROM sql_auto_complete('SELECT * FROM read_csv_a') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('read_csv_auto')

def test_autocomplete_tpch(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE partsupp(ps_suppkey int);")
        .statement("CREATE TABLE supplier(s_suppkey int);")
        .statement("CREATE TABLE nation(n_nationkey int);")
        .statement("SELECT * FROM sql_auto_complete('DROP TABLE na') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('nation')

    test = (
        ShellTest(shell)
        .statement("CREATE TABLE partsupp(ps_suppkey int);")
        .statement("CREATE TABLE supplier(s_suppkey int);")
        .statement("CREATE TABLE nation(n_nationkey int);")
        .statement("SELECT * FROM sql_auto_complete('SELECT s_supp') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('s_suppkey')

    test = (
        ShellTest(shell)
        .statement("CREATE TABLE partsupp(ps_suppkey int);")
        .statement("CREATE TABLE supplier(s_suppkey int);")
        .statement("CREATE TABLE nation(n_nationkey int);")
        .statement("SELECT * FROM sql_auto_complete('SELECT * FROM partsupp JOIN supp') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('supplier')

    test = (
        ShellTest(shell)
        .statement("CREATE TABLE partsupp(ps_suppkey int);")
        .statement("CREATE TABLE supplier(s_suppkey int);")
        .statement("CREATE TABLE nation(n_nationkey int);")
        .statement(".mode csv")
        .statement("SELECT l,l FROM sql_auto_complete('SELECT * FROM partsupp JOIN supplier ON (s_supp') t(l) LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('s_suppkey,s_suppkey')

    test = (
        ShellTest(shell)
        .statement("CREATE TABLE partsupp(ps_suppkey int);")
        .statement("CREATE TABLE supplier(s_suppkey int);")
        .statement("CREATE TABLE nation(n_nationkey int);")
        .statement("SELECT * FROM sql_auto_complete('SELECT * FROM partsupp JOIN supplier USING (ps_su') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('ps_suppkey')

def test_autocomplete_from(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("SELECT * FROM sql_auto_complete('SELECT * FR') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('FROM')

def test_autocomplete_disambiguation_column(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE MyTable(MyColumn Varchar);")
        .statement("SELECT * FROM sql_auto_complete('SELECT My') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('MyColumn')

def test_autocomplete_disambiguation_table(shell, autocomplete_extension):
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE MyTable(MyColumn Varchar);")
        .statement("SELECT * FROM sql_auto_complete('SELECT MyColumn FROM My') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout('MyTable')

def test_autocomplete_directory(shell, autocomplete_extension, tmp_path):
    shell_test_dir = tmp_path / 'shell_test_dir'
    extra_path = tmp_path / 'shell_test_dir' / 'extra_path'
    shell_test_dir.mkdir()
    extra_path.mkdir()

    # Create the files
    base_files = ['extra.parquet', 'extra.file']
    for fname in base_files:
        with open(shell_test_dir / fname, 'w+') as f:
            f.write('')

    # Complete the directory
    partial_directory = tmp_path / 'shell_test'
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE MyTable(MyColumn Varchar);")
        .statement(f"SELECT * FROM sql_auto_complete('SELECT * FROM ''{partial_directory.as_posix()}') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout("shell_test_dir")

    # Complete the sub directory as well
    partial_subdirectory = tmp_path / 'shell_test_dir' / 'extra'
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE MyTable(MyColumn Varchar);")
        .statement(f"SELECT * FROM sql_auto_complete('SELECT * FROM ''{partial_subdirectory.as_posix()}') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout("extra_path")

    # Complete the parquet file in the sub directory
    partial_parquet = tmp_path / 'shell_test_dir' / 'extra.par'
    test = (
        ShellTest(shell)
        .statement("CREATE TABLE MyTable(MyColumn Varchar);")
        .statement(f"SELECT * FROM sql_auto_complete('SELECT * FROM ''{partial_parquet.as_posix()}') LIMIT 1;")
    )
    result = test.run()
    result.check_stdout("extra.parquet")

# fmt: on
58
external/duckdb/tools/shell/tests/test_backwards_compatibility.py
vendored
Normal file
@@ -0,0 +1,58 @@
# fmt: off

import pytest
import subprocess
import sys
from typing import List
from conftest import ShellTest
import os

def test_version_dev(shell):
    test = (
        ShellTest(shell)
        .statement(".open test/storage/bc/db_dev.db")
    )
    result = test.run()
    result.check_stderr("older development version")

def test_version_0_3_1(shell):
    test = (
        ShellTest(shell)
        .statement(".open test/storage/bc/db_031.db")
    )
    result = test.run()
    result.check_stderr("v0.3.1")

def test_version_0_3_2(shell):
    test = (
        ShellTest(shell)
        .statement(".open test/storage/bc/db_032.db")
    )
    result = test.run()
    result.check_stderr("v0.3.2")

def test_version_0_4(shell):
    test = (
        ShellTest(shell)
        .statement(".open test/storage/bc/db_04.db")
    )
    result = test.run()
    result.check_stderr("v0.4.0")

def test_version_0_5_1(shell):
    test = (
        ShellTest(shell)
        .statement(".open test/storage/bc/db_051.db")
    )
    result = test.run()
    result.check_stderr("v0.5.1")

def test_version_0_6_0(shell):
    test = (
        ShellTest(shell)
        .statement(".open test/storage/bc/db_060.db")
    )
    result = test.run()
    result.check_stderr("v0.6.0")

# fmt: on
95
external/duckdb/tools/shell/tests/test_errors.py
vendored
Normal file
@@ -0,0 +1,95 @@
# fmt: off

import pytest
import subprocess
import sys
from typing import List
from conftest import ShellTest
import os

lineitem_ddl = 'CREATE TABLE lineitem(l_orderkey BIGINT NOT NULL, l_partkey BIGINT NOT NULL, l_suppkey BIGINT NOT NULL, l_linenumber BIGINT NOT NULL, l_quantity DECIMAL(15,2) NOT NULL, l_extendedprice DECIMAL(15,2) NOT NULL, l_discount DECIMAL(15,2) NOT NULL, l_tax DECIMAL(15,2) NOT NULL, l_returnflag VARCHAR NOT NULL, l_linestatus VARCHAR NOT NULL, l_shipdate DATE NOT NULL, l_commitdate DATE NOT NULL, l_receiptdate DATE NOT NULL, l_shipinstruct VARCHAR NOT NULL, l_shipmode VARCHAR NOT NULL, l_comment VARCHAR NOT NULL);'

@pytest.mark.skipif(os.name == 'nt', reason="Windows highlighting does not use shell escapes")
def test_incorrect_column(shell):
    test = (
        ShellTest(shell)
        .statement(".highlight_errors on")
        .statement(lineitem_ddl)
        .statement('select * from lineitem where l_extendedpric=5;')
    )
    result = test.run()
    result.check_stderr('"\x1b[33ml_extendedprice')
    result.check_stderr('"\x1b[33ml_extendedpric\x1b[0m')

@pytest.mark.skipif(os.name == 'nt', reason="Windows highlighting does not use shell escapes")
def test_missing_table(shell):
    test = (
        ShellTest(shell)
        .statement(".highlight_errors on")
        .statement(lineitem_ddl)
        .statement('select * from lineite where l_extendedprice=5;')
    )
    result = test.run()
    result.check_stderr('"\x1b[33mlineitem\x1b[0m')

@pytest.mark.skipif(os.name == 'nt', reason="Windows highlighting does not use shell escapes")
def test_long_error(shell):
    test = (
        ShellTest(shell)
        .statement(".highlight_errors on")
        .statement(lineitem_ddl)
        .statement('''SELECT
    l_returnflag,
    l_linestatus,
    sum(l_quantity) AS sum_qty,
    sum(l_extendedprice) AS sum_base_price,
    sum(l_extendedprice * (1 - l_discount)) AS sum_disc_price,
    sum(l_extendedprice * (1 - l_discount) * (1 + l_tax)) AS sum_charge,
    avg(l_quantity) AS avg_qty,
    avg(l_extendedprice) AS avg_price,
    avg(l_discount) AS avg_disc,
    count(*) AS count_order
FROM
    lineitem
WHERE
    l_shipdate <= CAST('1998-09-02' AS date) + timestamp '2020-01-01'
GROUP BY
    l_returnflag,
    l_linestatus
ORDER BY
    l_returnflag,
    l_linestatus;''')
    )
    result = test.run()
    result.check_stderr('\x1b[33m+(DATE, TIMESTAMP)\x1b[0m')
    result.check_stderr('\x1b[32mCAST\x1b[0m')

@pytest.mark.skipif(os.name == 'nt', reason="Windows highlighting does not use shell escapes")
def test_single_quotes_in_error(shell):
    test = (
        ShellTest(shell)
        .statement(".highlight_errors on")
        .statement("select \"I'm an error\"")
    )
    result = test.run()
    result.check_stderr('"\x1b[33mI\'m an error\x1b[0m')

@pytest.mark.skipif(os.name == 'nt', reason="Windows highlighting does not use shell escapes")
def test_double_quotes_in_error(shell):
    test = (
        ShellTest(shell)
        .statement(".highlight_errors on")
        .statement("select error('''I\"m an error''')")
    )
    result = test.run()
    result.check_stderr('\x1b[33mI"m an error\x1b[0m')

@pytest.mark.skipif(os.name == 'nt', reason="Windows highlighting does not use shell escapes")
def test_unterminated_quote(shell):
    test = (
        ShellTest(shell)
        .statement(".highlight_errors on")
        .statement("select error('I''m an error')")
    )
    result = test.run()
    result.check_stderr('I\'m an error')
15
external/duckdb/tools/shell/tests/test_explain.py
vendored
Normal file
@@ -0,0 +1,15 @@
# fmt: off

import pytest
import subprocess
import sys
from typing import List
from conftest import ShellTest
import os

def test_invalid_explain(shell):
    test = (
        ShellTest(shell)
        .statement("EXPLAIN SELECT 'any_string' IN ?;")
    )
    result = test.run()
30
external/duckdb/tools/shell/tests/test_get_env.py
vendored
Normal file
@@ -0,0 +1,30 @@
# fmt: off

from conftest import ShellTest

def test_get_env(shell):
    test = (
        ShellTest(shell)
        .statement('.null NULL')
        .statement("SET default_null_order=getenv('DEFAULT_NULL_ORDER');")
        .statement("SELECT * FROM (VALUES (42), (NULL)) ORDER BY 1 LIMIT 1;")
    )
    test.environment['DEFAULT_NULL_ORDER'] = 'NULLS_FIRST'
    result = test.run()
    result.check_stdout('NULL')

    test.environment['DEFAULT_NULL_ORDER'] = 'NULLS_LAST'
    result = test.run()
    result.check_stdout('42')

def test_get_env_permissions(shell):
    test = (
        ShellTest(shell)
        .statement('SET enable_external_access=false')
        .statement("SELECT getenv('DEFAULT_NULL_ORDER');")
    )
    test.environment['DEFAULT_NULL_ORDER'] = 'NULLS_FIRST'
    result = test.run()
    result.check_stderr('disabled through configuration')

# fmt: on
49
external/duckdb/tools/shell/tests/test_highlighting.py
vendored
Normal file
@@ -0,0 +1,49 @@
# fmt: off

import pytest
import subprocess
import sys
from typing import List
from conftest import ShellTest
import os

lineitem_ddl = 'CREATE TABLE lineitem(l_orderkey BIGINT NOT NULL, l_partkey BIGINT NOT NULL, l_suppkey BIGINT NOT NULL, l_linenumber BIGINT NOT NULL, l_quantity DECIMAL(15,2) NOT NULL, l_extendedprice DECIMAL(15,2) NOT NULL, l_discount DECIMAL(15,2) NOT NULL, l_tax DECIMAL(15,2) NOT NULL, l_returnflag VARCHAR NOT NULL, l_linestatus VARCHAR NOT NULL, l_shipdate DATE NOT NULL, l_commitdate DATE NOT NULL, l_receiptdate DATE NOT NULL, l_shipinstruct VARCHAR NOT NULL, l_shipmode VARCHAR NOT NULL, l_comment VARCHAR NOT NULL);'

@pytest.mark.skipif(os.name == 'nt', reason="Windows highlighting does not use shell escapes")
def test_highlight_column_header(shell):
    test = (
        ShellTest(shell)
        .statement(".highlight_results on")
        .statement('select NULL AS r;')
    )
    result = test.run()
    result.check_stdout('\x1b[90mNULL\x1b[0m')

@pytest.mark.skipif(os.name == 'nt', reason="Windows highlighting does not use shell escapes")
def test_custom_highlight(shell):
    test = (
        ShellTest(shell)
        .statement(".highlight_results on")
        .statement(".highlight_colors column_name red bold")
        .statement(".highlight_colors column_type yellow")
        .statement(lineitem_ddl)
        .statement('select * from lineitem;')
    )
    result = test.run()
    result.check_stdout('\x1b[1m\x1b[31ml_comment\x1b[0m')
    result.check_stdout('\x1b[33mvarchar\x1b[0m')

def test_custom_highlight_error(shell):
    test = (
        ShellTest(shell)
        .statement(".highlight_colors column_nameXX red")
        .statement(".highlight_colors column_name redXX")
        .statement(".highlight_colors column_name red boldXX")
        .statement(".highlight_colors column_name red bold zz")
    )
    result = test.run()
    result.check_stderr("Unknown element 'column_nameXX'")
    result.check_stderr("Unknown color 'redXX'")
    result.check_stderr("Unknown intensity 'boldXX'")
    result.check_stderr("Usage: .highlight_colors")

# fmt: on
Some files were not shown because too many files have changed in this diff