mirror of https://github.com/AMT-Cheif/drift.git
Merge branch 'develop' into beta
This commit is contained in:
commit 75f432c5f4
@ -4,7 +4,9 @@ Currently, we support
- showing errors in moor files
- outline
- (kinda) folding
- folding
- (very, very limited) autocomplete
- some quickfixes to make columns nullable or non-null

## Setup
To use this plugin, you'll need to perform these steps once. It is assumed that you
@ -1,3 +1,3 @@
## 0.0.1

* TODO: Describe initial release.
- Initial release. Contains standalone bindings and a moor implementation.
@ -1,28 +1,27 @@
# moor_ffi

Experimental bindings to sqlite by using `dart:ffi`. This library contains utils to make
Experimental Dart bindings to sqlite by using `dart:ffi`. This library contains utils to make
integration with [moor](https://pub.dev/packages/moor) easier, but it can also be used
as a standalone package.
as a standalone package. It also doesn't depend on Flutter, so it can be used on Dart VM
applications as well.

## Warnings
At the moment, `dart:ffi` is in preview and there will be breaking changes that this
library has to adapt to. This library has been tested on Dart `2.5.0`.
If you're using a development Dart version (this can include any Flutter channels that
are not `stable`), this library might not work.

If you're using a development Dart version (this includes Flutter channels that are not
`stable`), this library might not work.

If you just want to use moor, using the [moor_flutter](https://pub.dev/packages/moor_flutter)
package is the better option at the moment.
Also, please don't use this library on production apps yet.

## Supported platforms
You can make this library work on any platform that let's you obtain a `DynamicLibrary`
from which moor_ffi loads the functions (see below).
You can make this library work on any platform that lets you obtain a `DynamicLibrary`
in which sqlite's symbols are available (see below).

Out of the box, this libraries supports all platforms where `sqlite3` is installed:
Out of the box, this library supports all platforms where `sqlite3` is installed:
- iOS: Yes
- macOS: Yes
- Linux: Available on most distros
- Windows: When the user has installed sqlite (they probably have)
- Windows: Additional setup is required
- Android: Yes when used with Flutter

This library works with and without Flutter.
@ -33,7 +32,7 @@ we need to compile sqlite.

### On other platforms
Using this library on platforms that are not supported out of the box is fairly
straightforward. For instance, if you release your own `sqlite3.so` with your application,
straightforward. For instance, if you release your own `sqlite3.so` next to your application,
you could use
```dart
import 'dart:ffi';
@ -54,14 +53,14 @@ DynamicLibrary _openOnLinux() {
  return DynamicLibrary.open(libraryNextToScript.path);
}
```
Just be sure to first override the behavior and then opening the database. Further,
if you want to use the isolate api, you can only use a static method or top-level
function to open the library.
Just be sure to first override the behavior and then open the database. Further,
if you want to use the isolate api, you can only use a static method or a top-level
function to open the library. For Windows, a similar setup with a `sqlite3.dll` library
should work.

### Supported datatypes
This library supports `null`, `int`, other `num`s (converted to double),
`String` and `Uint8List` to bind args. Returned columns from select statements
will have the same types.
This library supports `null`, `int`, `double`, `String` and `Uint8List` to bind args.
Returned columns from select statements will have the same types.

## Using without moor
```dart
@ -75,15 +74,12 @@ void main() {
```

You can also use an asynchronous API on a background isolate by using `IsolateDb.openFile`
or `IsolateDb.openMemory`, respectively. be aware that the asynchronous API is much slower,
or `IsolateDb.openMemory`, respectively. Be aware that the asynchronous API is much slower,
but it moves work out of the UI isolate.

Be sure to __always__ call `Database.close` to avoid memory leaks!

## Migrating from moor_flutter
__Note__: For production apps, please use `moor_flutter` until this package
reaches a stable version.

## Using with moor
Add both `moor` and `moor_ffi` to your pubspec, the `moor_flutter` dependency can be dropped.

```yaml

@ -100,5 +96,5 @@ In all other project files that use moor apis (e.g. a `Value` class for companio

Finally, replace usages of `FlutterQueryExecutor` with `VmDatabase`.

Note that, at the moment, there is no counterpart for `FlutterQueryExecutor.inDatabasePath` and that the async API using
a background isolate is not available yet. Both shortcomings with be fixed by the upcoming moor 2.0 release.
Note that, at the moment, there is no direct counterpart for `FlutterQueryExecutor.inDatabasePath`
and that the async API using a background isolate is not available for moor yet.
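The `pubspec.yaml` snippet above is truncated by the hunk boundary. A sketch of what the dependency change described in the text could look like (version numbers are illustrative placeholders, not taken from this commit):

```yaml
dependencies:
  # moor_flutter is dropped; moor and moor_ffi replace it
  moor: ^1.7.0
  moor_ffi: ^0.0.1
```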
@ -11,11 +11,12 @@ class CBlob extends Struct<CBlob> {
  int data;

  static Pointer<CBlob> allocate(Uint8List blob) {
    final str = Pointer<CBlob>.allocate(count: blob.length);
    for (var i = 0; i < blob.length; i++) {
      str.elementAt(i).load<CBlob>().data = blob[i];
    }
    return str;
    final str = Pointer<Uint8>.allocate(count: blob.length);

    final asList = str.asExternalTypedData(count: blob.length) as Uint8List;
    asList.setAll(0, blob);

    return str.cast();
  }

  /// Allocates a 0-terminated string, encoded as utf8 and read from the
@ -30,18 +31,25 @@ class CBlob extends Struct<CBlob> {
  /// Reads [bytesToRead] bytes from the current position.
  Uint8List read(int bytesToRead) {
    assert(bytesToRead >= 0);
    final str = addressOf;
    final str = addressOf.cast<Uint8>();
    if (isNullPointer(str)) return null;

    // todo can we user Pointer.asExternalTypedData here?
    final blob = Uint8List(bytesToRead);
    for (var i = 0; i < bytesToRead; ++i) {
      blob[i] = str.elementAt(i).load<CBlob>().data;
    final data = str.asExternalTypedData(count: bytesToRead) as Uint8List;
    return Uint8List.fromList(data);
    }
    return blob;

  /// More efficient version of [readString] that doesn't have to find a nil-
  /// terminator. [length] is the amount of bytes to read. The string will be
  /// decoded via [utf8].
  String readAsStringWithLength(int length) {
    return utf8.decode(read(length));
  }

  /// Reads a 0-terminated string, decoded with utf8.
  ///
  /// Warning: This method is very, very slow. If there is any way to know the
  /// length of the string to read, [readAsStringWithLength] will be orders of
  /// magnitude faster.
  String readString() {
    final str = addressOf;
    if (isNullPointer(str)) return null;
@ -59,10 +59,11 @@ class PreparedStatement implements BasePreparedStatement {
      case Types.SQLITE_FLOAT:
        return bindings.sqlite3_column_double(_stmt, index);
      case Types.SQLITE_TEXT:
        final length = bindings.sqlite3_column_bytes(_stmt, index);
        return bindings
            .sqlite3_column_text(_stmt, index)
            .load<CBlob>()
            .readString();
            .readAsStringWithLength(length);
      case Types.SQLITE_BLOB:
        final length = bindings.sqlite3_column_bytes(_stmt, index);
        return bindings
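The hunk above switches text columns from `readString`, which scans for a 0 terminator, to `readAsStringWithLength` with the byte count reported by `sqlite3_column_bytes`. A standalone sketch of the difference, using a plain byte buffer instead of the package's pointer types (function names mirror the diff, but these helpers are illustrative, not the package's API):

```dart
import 'dart:convert';
import 'dart:typed_data';

/// Slow path: scan byte-by-byte until the 0 terminator is found.
String readString(Uint8List bytes) {
  var end = 0;
  while (end < bytes.length && bytes[end] != 0) {
    end++;
  }
  return utf8.decode(bytes.sublist(0, end));
}

/// Fast path: the length is already known, so decode directly
/// without scanning for a terminator.
String readAsStringWithLength(Uint8List bytes, int length) {
  return utf8.decode(bytes.sublist(0, length));
}
```

Both return the same text; the length-based variant avoids a linear scan per column value, which is why the diff threads `sqlite3_column_bytes` through.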
@ -1,6 +1,6 @@
import 'package:test/test.dart';

import '../runners.dart';
import '../ffi_test.dart';

void main(TestedDatabase db) {
  test('insert statements report their id', () async {
@ -1,6 +1,6 @@
import 'package:test/test.dart';

import '../runners.dart';
import '../ffi_test.dart';

void main(TestedDatabase db) {
  test('prepared statements can be used multiple times', () async {
@ -1,6 +1,6 @@
import 'package:test/test.dart';

import '../runners.dart';
import '../ffi_test.dart';

void main(TestedDatabase db) {
  test('select statements return expected value', () async {
@ -1,6 +1,6 @@
import 'package:test/test.dart';

import '../runners.dart';
import '../ffi_test.dart';

void main(TestedDatabase db) {
  test('can set the user version on a database', () async {
@ -80,9 +80,15 @@ class MoorDriver implements AnalysisDriverGeneric {
        }
        final backendTask = _createTask(mostImportantFile.file.uri);

        try {
          final task = session.startTask(backendTask);
          await task.runTask();
          _tracker.handleTaskCompleted(task);
        } catch (e, s) {
          Logger.root.warning(
              'Error while working on ${mostImportantFile.file.uri}', e, s);
          _tracker.removePending(mostImportantFile);
        }
      } finally {
        _isWorking = false;
      }
@ -141,11 +147,16 @@ class MoorDriver implements AnalysisDriverGeneric {

  /// Waits for the file at [path] to be parsed.
  Future<FoundFile> waitFileParsed(String path) {
    _scheduler.notify(this);
    final found = pathToFoundFile(path);

    if (found.isParsed) {
      return Future.value(found);
    } else {
      _scheduler.notify(this);
      final found = pathToFoundFile(path);

      return completedFiles()
          .firstWhere((file) => file == found && file.isParsed);
    }
  }
}
@ -76,6 +76,13 @@ class FileTracker {
    }
  }

  /// Manually remove the [file] from the backlog. As the plugin is still very
  /// unstable, we use this on unexpected errors so that we just move on to the
  /// next file if there is a problem.
  void removePending(TrackedFile file) {
    _pendingWork.remove(file);
  }

  void dispose() {
    _computations.close();
  }
@ -37,6 +37,12 @@ abstract class _AssistOnNodeContributor<T extends AstNode> {
  const _AssistOnNodeContributor();

  void contribute(AssistCollector collector, T node, String path);

  SourceEdit replaceNode(AstNode node, String text) {
    final start = node.firstPosition;
    final length = node.lastPosition - start;
    return SourceEdit(start, length, text);
  }
}

class AssistId {
@ -9,9 +9,9 @@ class ColumnNullability extends _AssistOnNodeContributor<ColumnDefinition> {
    final notNull = node.findConstraint<NotNull>();

    if (notNull == null) {
      // there is no not-null constraint on this column, suggest to add one at
      // the end of the definition
      final end = node.lastPosition;
      // there is no not-null constraint on this column, suggest to add one
      // after the type name
      final end = node.typeNames.last.span.end.offset;
      final id = AssistId.makeNotNull;

      collector.addAssist(PrioritizedSourceChange(
@ -34,9 +34,7 @@ class ColumnNullability extends _AssistOnNodeContributor<ColumnDefinition> {
      collector.addAssist(PrioritizedSourceChange(
        id.priority,
        SourceChange('Make this column nullable', id: id.id, edits: [
          SourceFileEdit(path, -1, edits: [
            SourceEdit(notNull.firstPosition, notNull.lastPosition, '')
          ])
          SourceFileEdit(path, -1, edits: [replaceNode(notNull, '')])
        ]),
      ));
    }
@ -6,6 +6,9 @@ class ColumnDefinition extends AstNode {
  final String typeName;
  final List<ColumnConstraint> constraints;

  /// The tokens there were involved in defining the type of this column.
  List<Token> typeNames;

  ColumnDefinition(
      {@required this.columnName,
      @required this.typeName,
@ -21,11 +21,19 @@ class AutoCompleteEngine {
  final List<Hint> _hints = [];
  UnmodifiableListView<Hint> _hintsView;

  final List<Token> _tokens;

  void addHint(Hint hint) {
    _hints.insert(_lastHintBefore(hint.offset), hint);
    if (_hints.isEmpty) {
      _hints.add(hint);
    } else {
      // ensure that the hints list stays sorted by offset
      final position = _lastHintBefore(hint.offset);
      _hints.insert(position + 1, hint);
    }
  }

  AutoCompleteEngine() {
  AutoCompleteEngine(this._tokens) {
    _hintsView = UnmodifiableListView(_hints);
  }
@ -40,13 +48,23 @@ class AutoCompleteEngine {
    final hint = foundHints[_lastHintBefore(offset)];

    final suggestions = hint.description.suggest(CalculationRequest()).toList();
    return ComputedSuggestions(hint.offset, offset - hint.offset, suggestions);

    // when calculating the offset, respect whitespace that comes after the
    // last hint.
    final lastToken = hint.before;
    final nextToken =
        lastToken != null ? _tokens[lastToken.index + 1] : _tokens.first;

    final hintOffset = nextToken.span.start.offset;
    final length = offset - hintOffset;

    return ComputedSuggestions(hintOffset, length, suggestions);
  }

  /// find the last hint that appears before [offset]
  int _lastHintBefore(int offset) {
    // find the last hint that appears before offset
    var min = 0;
    var max = foundHints.length;
    var max = foundHints.length - 1;

    while (min < max) {
      final mid = min + ((max - min) >> 1);
@ -56,12 +74,20 @@ class AutoCompleteEngine {

      if (offsetOfMid == offset) {
        return mid;
      } else if (offsetOfMid < offset) {
      } else {
        final offsetOfNext = _hints[mid + 1].offset;

        if (offsetOfMid < offset) {
          if (offsetOfNext > offset) {
            // next one is too late, so this must be the correct one
            return mid;
          }
          min = mid + 1;
        } else {
          max = mid - 1;
        }
      }
    }

    return min;
  }
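The two hunks above fix `_lastHintBefore` to be a binary search for the last hint at or before a given offset, with an inclusive upper bound and a check against the next element. The same idea as a standalone sketch over a plain sorted list of offsets (a hypothetical `lastHintBefore` helper, not the plugin's API; like the original, it assumes a non-empty list and returns index 0 when every offset is past the target):

```dart
/// Returns the index of the last entry in [offsets] that is <= [offset].
/// [offsets] must be sorted ascending, mirroring the sorted hints list.
int lastHintBefore(List<int> offsets, int offset) {
  var min = 0;
  var max = offsets.length - 1; // inclusive upper bound, as in the fix

  while (min < max) {
    final mid = min + ((max - min) >> 1);
    final offsetOfMid = offsets[mid];

    if (offsetOfMid == offset) {
      return mid;
    } else {
      final offsetOfNext = offsets[mid + 1];
      if (offsetOfMid < offset) {
        if (offsetOfNext > offset) {
          // next one is too late, so this must be the correct one
          return mid;
        }
        min = mid + 1;
      } else {
        max = mid - 1;
      }
    }
  }
  return min;
}
```

Checking the next element is what lets the loop stop early instead of converging one-sided, which is what the replaced `else if` version got wrong.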
@ -66,8 +66,9 @@ class SqlEngine {
  ParseResult parseMoorFile(String content) {
    assert(useMoorExtensions);

    final autoComplete = AutoCompleteEngine();
    final tokens = tokenize(content);
    final autoComplete = AutoCompleteEngine(tokens);

    final tokensForParser = tokens.where((t) => !t.invisibleToParser).toList();
    final parser =
        Parser(tokensForParser, useMoor: true, autoComplete: autoComplete);
@ -80,7 +80,14 @@ mixin SchemaParser on ParserBase {
    final name = _consume(TokenType.identifier, 'Expected a column name')
        as IdentifierToken;

    final typeName = _typeName();
    final typeTokens = _typeName();
    String typeName;

    if (typeTokens != null) {
      final typeSpan = typeTokens.first.span.expand(typeTokens.last.span);
      typeName = typeSpan.text;
    }

    final constraints = <ColumnConstraint>[];
    ColumnConstraint constraint;
    while ((constraint = _columnConstraint(orNull: true)) != null) {
@ -91,19 +98,21 @@ mixin SchemaParser on ParserBase {
      columnName: name.identifier,
      typeName: typeName,
      constraints: constraints,
    )..setSpan(name, _previous);
    )
      ..setSpan(name, _previous)
      ..typeNames = typeTokens;
  }

  String _typeName() {
  List<Token> _typeName() {
    // sqlite doesn't really define what a type name is and has very loose rules
    // at turning them into a type affinity. We support this pattern:
    // typename = identifier [ "(" { identifier | comma | number_literal } ")" ]
    if (!_matchOne(TokenType.identifier)) return null;

    final typeNameBuilder = StringBuffer(_previous.lexeme);
    final typeNames = [_previous];

    if (_matchOne(TokenType.leftParen)) {
      typeNameBuilder.write('(');
      typeNames.add(_previous);

      const inBrackets = [
        TokenType.identifier,
@ -111,14 +120,15 @@ mixin SchemaParser on ParserBase {
        TokenType.numberLiteral
      ];
      while (_match(inBrackets)) {
        typeNameBuilder..write(' ')..write(_previous.lexeme);
        typeNames.add(_previous);
      }

      _consume(TokenType.rightParen,
          'Expected closing paranthesis to finish type name');
      typeNames.add(_previous);
    }

    return typeNameBuilder.toString();
    return typeNames;
  }

  ColumnConstraint _columnConstraint({bool orNull = false}) {
@ -127,10 +137,13 @@ mixin SchemaParser on ParserBase {
    final resolvedName = _constraintNameOrNull();

    if (_matchOne(TokenType.primary)) {
      _suggestHint(HintDescription.token(TokenType.key));
      _consume(TokenType.key, 'Expected KEY to complete PRIMARY KEY clause');

      final mode = _orderingModeOrNull();
      final conflict = _conflictClauseOrNull();

      _suggestHint(HintDescription.token(TokenType.autoincrement));
      final hasAutoInc = _matchOne(TokenType.autoincrement);

      return PrimaryKeyColumn(resolvedName,
@ -138,6 +151,8 @@ mixin SchemaParser on ParserBase {
        ..setSpan(first, _previous);
    }
    if (_matchOne(TokenType.not)) {
      _suggestHint(HintDescription.token(TokenType.$null));

      final notToken = _previous;
      final nullToken =
          _consume(TokenType.$null, 'Expected NULL to complete NOT NULL');
@ -249,6 +264,7 @@ mixin SchemaParser on ParserBase {
  }

  ConflictClause _conflictClauseOrNull() {
    _suggestHint(HintDescription.token(TokenType.on));
    if (_matchOne(TokenType.on)) {
      _consume(TokenType.conflict,
          'Expected CONFLICT to complete ON CONFLICT clause');
@ -260,6 +276,7 @@ mixin SchemaParser on ParserBase {
        TokenType.ignore: ConflictClause.ignore,
        TokenType.replace: ConflictClause.replace,
      };
      _suggestHint(HintDescription.tokens(modes.keys.toList()));

      if (_match(modes.keys)) {
        return modes[_previous.type];
@ -284,7 +301,10 @@ mixin SchemaParser on ParserBase {

    ReferenceAction onDelete, onUpdate;

    _suggestHint(HintDescription.token(TokenType.on));
    while (_matchOne(TokenType.on)) {
      _suggestHint(
          const HintDescription.tokens([TokenType.delete, TokenType.update]));
      if (_matchOne(TokenType.delete)) {
        onDelete = _referenceAction();
      } else if (_matchOne(TokenType.update)) {
@ -35,6 +35,11 @@ class Scanner {

    final endSpan = _file.span(source.length);
    tokens.add(Token(TokenType.eof, endSpan));

    for (var i = 0; i < tokens.length; i++) {
      tokens[i].index = i;
    }

    return tokens;
  }
@ -260,7 +260,10 @@ class Token {
  final FileSpan span;
  String get lexeme => span.text;

  const Token(this.type, this.span);
  /// The index of this [Token] in the list of tokens scanned.
  int index;

  Token(this.type, this.span);

  @override
  String toString() {
@ -274,7 +277,7 @@ class StringLiteralToken extends Token {
  /// sqlite allows binary strings (x'literal') which are interpreted as blobs.
  final bool binary;

  const StringLiteralToken(this.value, FileSpan span, {this.binary = false})
  StringLiteralToken(this.value, FileSpan span, {this.binary = false})
      : super(TokenType.stringLiteral, span);
}
@ -295,7 +298,7 @@ class IdentifierToken extends Token {
    }
  }

  const IdentifierToken(this.escaped, FileSpan span, {this.synthetic = false})
  IdentifierToken(this.escaped, FileSpan span, {this.synthetic = false})
      : super(TokenType.identifier, span);
}