mirror of https://github.com/AMT-Cheif/drift.git
Merge branch 'develop' into beta
commit 75f432c5f4
@@ -4,7 +4,9 @@ Currently, we support
 - showing errors in moor files
 - outline
-- (kinda) folding
+- folding
+- (very, very limited) autocomplete
+- some quickfixes to make columns nullable or non-null
 
 ## Setup
 To use this plugin, you'll need to perform these steps once. It is assumed that you
@@ -1,3 +1,3 @@
 ## 0.0.1
 
-* TODO: Describe initial release.
+- Initial release. Contains standalone bindings and a moor implementation.
@@ -1,28 +1,27 @@
 # moor_ffi
 
-Experimental bindings to sqlite by using `dart:ffi`. This library contains utils to make
+Experimental Dart bindings to sqlite by using `dart:ffi`. This library contains utils to make
 integration with [moor](https://pub.dev/packages/moor) easier, but it can also be used
-as a standalone package.
+as a standalone package. It also doesn't depend on Flutter, so it can be used on Dart VM
+applications as well.
 
 ## Warnings
 At the moment, `dart:ffi` is in preview and there will be breaking changes that this
 library has to adapt to. This library has been tested on Dart `2.5.0`.
-
-If you're using a development Dart version (this includes Flutter channels that are not
-`stable`), this library might not work.
+If you're using a development Dart version (this can include any Flutter channels that
+are not `stable`), this library might not work.
 
-If you just want to use moor, using the [moor_flutter](https://pub.dev/packages/moor_flutter)
-package is the better option at the moment.
+Also, please don't use this library on production apps yet.
 
 ## Supported platforms
-You can make this library work on any platform that let's you obtain a `DynamicLibrary`
-from which moor_ffi loads the functions (see below).
+You can make this library work on any platform that lets you obtain a `DynamicLibrary`
+in which sqlite's symbols are available (see below).
 
-Out of the box, this libraries supports all platforms where `sqlite3` is installed:
+Out of the box, this library supports all platforms where `sqlite3` is installed:
 - iOS: Yes
 - macOS: Yes
 - Linux: Available on most distros
-- Windows: When the user has installed sqlite (they probably have)
+- Windows: Additional setup is required
 - Android: Yes when used with Flutter
 
 This library works with and without Flutter.
@@ -33,7 +32,7 @@ we need to compile sqlite.
 
 ### On other platforms
 Using this library on platforms that are not supported out of the box is fairly
-straightforward. For instance, if you release your own `sqlite3.so` with your application,
+straightforward. For instance, if you release your own `sqlite3.so` next to your application,
 you could use
 ```dart
 import 'dart:ffi';
@@ -54,14 +53,14 @@ DynamicLibrary _openOnLinux() {
   return DynamicLibrary.open(libraryNextToScript.path);
 }
 ```
-Just be sure to first override the behavior and then opening the database. Further,
-if you want to use the isolate api, you can only use a static method or top-level
-function to open the library.
+Just be sure to first override the behavior and then open the database. Further,
+if you want to use the isolate api, you can only use a static method or a top-level
+function to open the library. For Windows, a similar setup with a `sqlite3.dll` library
+should work.
 
 ### Supported datatypes
-This library supports `null`, `int`, other `num`s (converted to double),
-`String` and `Uint8List` to bind args. Returned columns from select statements
-will have the same types.
+This library supports `null`, `int`, `double`, `String` and `Uint8List` to bind args.
+Returned columns from select statements will have the same types.
 
 ## Using without moor
 ```dart
@@ -75,15 +74,12 @@ void main() {
 ```
 
 You can also use an asynchronous API on a background isolate by using `IsolateDb.openFile`
-or `IsolateDb.openMemory`, respectively. be aware that the asynchronous API is much slower,
+or `IsolateDb.openMemory`, respectively. Be aware that the asynchronous API is much slower,
 but it moves work out of the UI isolate.
 
 Be sure to __always__ call `Database.close` to avoid memory leaks!
 
-## Migrating from moor_flutter
-__Note__: For production apps, please use `moor_flutter` until this package
-reaches a stable version.
-
+## Using with moor
 Add both `moor` and `moor_ffi` to your pubspec, the `moor_flutter` dependency can be dropped.
 
 ```yaml
@@ -100,5 +96,5 @@ In all other project files that use moor apis (e.g. a `Value` class for companions)
 
 Finally, replace usages of `FlutterQueryExecutor` with `VmDatabase`.
 
-Note that, at the moment, there is no counterpart for `FlutterQueryExecutor.inDatabasePath` and that the async API using
-a background isolate is not available yet. Both shortcomings with be fixed by the upcoming moor 2.0 release.
+Note that, at the moment, there is no direct counterpart for `FlutterQueryExecutor.inDatabasePath`
+and that the async API using a background isolate is not available for moor yet.
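The migration described in this hunk can be sketched as a short Dart example. This is illustrative only: `MyDatabase` stands in for an application's generated moor database class, and the file-based `VmDatabase` constructor shape is an assumption based on this diff, not a verified API.

```dart
import 'dart:io';

import 'package:moor_ffi/moor_ffi.dart';

// Hypothetical sketch of replacing FlutterQueryExecutor with VmDatabase.
// Previously: MyDatabase(FlutterQueryExecutor.inDatabaseFolder(...)).
// Since there is no direct counterpart for inDatabasePath yet, the
// application picks the database file location itself.
MyDatabase openMyDatabase() {
  final file = File('app.db');
  return MyDatabase(VmDatabase(file));
}
```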
@@ -11,11 +11,12 @@ class CBlob extends Struct<CBlob> {
   int data;
 
   static Pointer<CBlob> allocate(Uint8List blob) {
-    final str = Pointer<CBlob>.allocate(count: blob.length);
-    for (var i = 0; i < blob.length; i++) {
-      str.elementAt(i).load<CBlob>().data = blob[i];
-    }
-    return str;
+    final str = Pointer<Uint8>.allocate(count: blob.length);
+    final asList = str.asExternalTypedData(count: blob.length) as Uint8List;
+    asList.setAll(0, blob);
+    return str.cast();
   }
 
   /// Allocates a 0-terminated string, encoded as utf8 and read from the
@@ -30,18 +31,25 @@ class CBlob extends Struct<CBlob> {
   /// Reads [bytesToRead] bytes from the current position.
   Uint8List read(int bytesToRead) {
     assert(bytesToRead >= 0);
-    final str = addressOf;
+    final str = addressOf.cast<Uint8>();
     if (isNullPointer(str)) return null;
 
-    // todo can we user Pointer.asExternalTypedData here?
-    final blob = Uint8List(bytesToRead);
-    for (var i = 0; i < bytesToRead; ++i) {
-      blob[i] = str.elementAt(i).load<CBlob>().data;
-    }
-    return blob;
+    final data = str.asExternalTypedData(count: bytesToRead) as Uint8List;
+    return Uint8List.fromList(data);
+  }
+
+  /// More efficient version of [readString] that doesn't have to find a nil-
+  /// terminator. [length] is the amount of bytes to read. The string will be
+  /// decoded via [utf8].
+  String readAsStringWithLength(int length) {
+    return utf8.decode(read(length));
   }
 
   /// Reads a 0-terminated string, decoded with utf8.
+  ///
+  /// Warning: This method is very, very slow. If there is any way to know the
+  /// length of the string to read, [readAsStringWithLength] will be orders of
+  /// magnitude faster.
   String readString() {
     final str = addressOf;
     if (isNullPointer(str)) return null;
@@ -59,10 +59,11 @@ class PreparedStatement implements BasePreparedStatement {
       case Types.SQLITE_FLOAT:
         return bindings.sqlite3_column_double(_stmt, index);
       case Types.SQLITE_TEXT:
+        final length = bindings.sqlite3_column_bytes(_stmt, index);
         return bindings
             .sqlite3_column_text(_stmt, index)
             .load<CBlob>()
-            .readString();
+            .readAsStringWithLength(length);
       case Types.SQLITE_BLOB:
         final length = bindings.sqlite3_column_bytes(_stmt, index);
         return bindings
@@ -1,6 +1,6 @@
 import 'package:test/test.dart';
 
-import '../runners.dart';
+import '../ffi_test.dart';
 
 void main(TestedDatabase db) {
   test('insert statements report their id', () async {
@@ -1,6 +1,6 @@
 import 'package:test/test.dart';
 
-import '../runners.dart';
+import '../ffi_test.dart';
 
 void main(TestedDatabase db) {
   test('prepared statements can be used multiple times', () async {
@@ -1,6 +1,6 @@
 import 'package:test/test.dart';
 
-import '../runners.dart';
+import '../ffi_test.dart';
 
 void main(TestedDatabase db) {
   test('select statements return expected value', () async {
@@ -1,6 +1,6 @@
 import 'package:test/test.dart';
 
-import '../runners.dart';
+import '../ffi_test.dart';
 
 void main(TestedDatabase db) {
   test('can set the user version on a database', () async {
@@ -80,9 +80,15 @@ class MoorDriver implements AnalysisDriverGeneric {
       }
       final backendTask = _createTask(mostImportantFile.file.uri);
 
-      final task = session.startTask(backendTask);
-      await task.runTask();
-      _tracker.handleTaskCompleted(task);
+      try {
+        final task = session.startTask(backendTask);
+        await task.runTask();
+        _tracker.handleTaskCompleted(task);
+      } catch (e, s) {
+        Logger.root.warning(
+            'Error while working on ${mostImportantFile.file.uri}', e, s);
+        _tracker.removePending(mostImportantFile);
+      }
     } finally {
       _isWorking = false;
     }
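The control flow this hunk introduces, isolating a failure on one file so the work loop can keep going, can be sketched in plain Dart. The names below (`runTask`, the backlog list) mirror the diff, but the surrounding types are placeholders, not the plugin's real API.

```dart
// Sketch of the error-isolation pattern: a failing file is logged and
// dropped from the backlog instead of wedging all further analysis work.
Future<void> handleWork(
    List<Uri> backlog, Future<void> Function(Uri) runTask) async {
  while (backlog.isNotEmpty) {
    final file = backlog.first;
    try {
      await runTask(file);
      backlog.remove(file); // completed successfully
    } catch (e, s) {
      // As in the diff: warn and move on to the next file.
      print('Error while working on $file: $e\n$s');
      backlog.remove(file);
    }
  }
}
```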
@@ -141,11 +147,16 @@ class MoorDriver implements AnalysisDriverGeneric {
 
   /// Waits for the file at [path] to be parsed.
   Future<FoundFile> waitFileParsed(String path) {
-    _scheduler.notify(this);
-
     final found = pathToFoundFile(path);
 
-    return completedFiles()
-        .firstWhere((file) => file == found && file.isParsed);
+    if (found.isParsed) {
+      return Future.value(found);
+    } else {
+      _scheduler.notify(this);
+      final found = pathToFoundFile(path);
+
+      return completedFiles()
+          .firstWhere((file) => file == found && file.isParsed);
+    }
   }
 }
@@ -76,6 +76,13 @@ class FileTracker {
     }
   }
 
+  /// Manually remove the [file] from the backlog. As the plugin is still very
+  /// unstable, we use this on unexpected errors so that we just move on to the
+  /// next file if there is a problem.
+  void removePending(TrackedFile file) {
+    _pendingWork.remove(file);
+  }
+
   void dispose() {
     _computations.close();
   }
@@ -37,6 +37,12 @@ abstract class _AssistOnNodeContributor<T extends AstNode> {
   const _AssistOnNodeContributor();
 
   void contribute(AssistCollector collector, T node, String path);
+
+  SourceEdit replaceNode(AstNode node, String text) {
+    final start = node.firstPosition;
+    final length = node.lastPosition - start;
+    return SourceEdit(start, length, text);
+  }
 }
 
 class AssistId {
@@ -9,9 +9,9 @@ class ColumnNullability extends _AssistOnNodeContributor<ColumnDefinition> {
     final notNull = node.findConstraint<NotNull>();
 
     if (notNull == null) {
-      // there is no not-null constraint on this column, suggest to add one at
-      // the end of the definition
-      final end = node.lastPosition;
+      // there is no not-null constraint on this column, suggest to add one
+      // after the type name
+      final end = node.typeNames.last.span.end.offset;
       final id = AssistId.makeNotNull;
 
       collector.addAssist(PrioritizedSourceChange(
@@ -34,9 +34,7 @@ class ColumnNullability extends _AssistOnNodeContributor<ColumnDefinition> {
       collector.addAssist(PrioritizedSourceChange(
         id.priority,
         SourceChange('Make this column nullable', id: id.id, edits: [
-          SourceFileEdit(path, -1, edits: [
-            SourceEdit(notNull.firstPosition, notNull.lastPosition, '')
-          ])
+          SourceFileEdit(path, -1, edits: [replaceNode(notNull, '')])
         ]),
       ));
     }
@@ -6,6 +6,9 @@ class ColumnDefinition extends AstNode {
   final String typeName;
   final List<ColumnConstraint> constraints;
 
+  /// The tokens there were involved in defining the type of this column.
+  List<Token> typeNames;
+
   ColumnDefinition(
       {@required this.columnName,
       @required this.typeName,
@@ -21,11 +21,19 @@ class AutoCompleteEngine {
   final List<Hint> _hints = [];
   UnmodifiableListView<Hint> _hintsView;
 
+  final List<Token> _tokens;
+
   void addHint(Hint hint) {
-    _hints.insert(_lastHintBefore(hint.offset), hint);
+    if (_hints.isEmpty) {
+      _hints.add(hint);
+    } else {
+      // ensure that the hints list stays sorted by offset
+      final position = _lastHintBefore(hint.offset);
+      _hints.insert(position + 1, hint);
+    }
   }
 
-  AutoCompleteEngine() {
+  AutoCompleteEngine(this._tokens) {
     _hintsView = UnmodifiableListView(_hints);
   }
@@ -40,13 +48,23 @@ class AutoCompleteEngine {
     final hint = foundHints[_lastHintBefore(offset)];
 
     final suggestions = hint.description.suggest(CalculationRequest()).toList();
-    return ComputedSuggestions(hint.offset, offset - hint.offset, suggestions);
+
+    // when calculating the offset, respect whitespace that comes after the
+    // last hint.
+    final lastToken = hint.before;
+    final nextToken =
+        lastToken != null ? _tokens[lastToken.index + 1] : _tokens.first;
+
+    final hintOffset = nextToken.span.start.offset;
+    final length = offset - hintOffset;
+
+    return ComputedSuggestions(hintOffset, length, suggestions);
   }
 
+  /// find the last hint that appears before [offset]
   int _lastHintBefore(int offset) {
-    // find the last hint that appears before offset
     var min = 0;
-    var max = foundHints.length;
+    var max = foundHints.length - 1;
 
     while (min < max) {
       final mid = min + ((max - min) >> 1);
@@ -56,10 +74,18 @@ class AutoCompleteEngine {
 
       if (offsetOfMid == offset) {
         return mid;
-      } else if (offsetOfMid < offset) {
-        min = mid + 1;
       } else {
-        max = mid - 1;
+        final offsetOfNext = _hints[mid + 1].offset;
+
+        if (offsetOfMid < offset) {
+          if (offsetOfNext > offset) {
+            // next one is too late, so this must be the correct one
+            return mid;
+          }
+          min = mid + 1;
+        } else {
+          max = mid - 1;
+        }
       }
     }
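The rewritten `_lastHintBefore` is a binary search for the last element at or before a target offset. A self-contained sketch of that search over plain integers (illustrative names, not the engine's API) shows the same early-return when the next element already overshoots:

```dart
/// Returns the index of the last entry in the sorted [offsets] list that is
/// at or before [target]. Mirrors the structure of _lastHintBefore.
int lastBefore(List<int> offsets, int target) {
  var min = 0;
  var max = offsets.length - 1;

  while (min < max) {
    final mid = min + ((max - min) >> 1);
    final value = offsets[mid];

    if (value == target) {
      return mid;
    } else {
      final next = offsets[mid + 1];
      if (value < target) {
        // next one is too late, so this must be the correct one
        if (next > target) return mid;
        min = mid + 1;
      } else {
        max = mid - 1;
      }
    }
  }
  return min;
}

void main() {
  print(lastBefore([0, 4, 9, 15], 10)); // index of offset 9, i.e. 2
}
```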
@@ -66,8 +66,9 @@ class SqlEngine {
   ParseResult parseMoorFile(String content) {
     assert(useMoorExtensions);
 
-    final autoComplete = AutoCompleteEngine();
     final tokens = tokenize(content);
+    final autoComplete = AutoCompleteEngine(tokens);
+
     final tokensForParser = tokens.where((t) => !t.invisibleToParser).toList();
     final parser =
         Parser(tokensForParser, useMoor: true, autoComplete: autoComplete);
@@ -80,7 +80,14 @@ mixin SchemaParser on ParserBase {
     final name = _consume(TokenType.identifier, 'Expected a column name')
         as IdentifierToken;
 
-    final typeName = _typeName();
+    final typeTokens = _typeName();
+    String typeName;
+
+    if (typeTokens != null) {
+      final typeSpan = typeTokens.first.span.expand(typeTokens.last.span);
+      typeName = typeSpan.text;
+    }
 
     final constraints = <ColumnConstraint>[];
     ColumnConstraint constraint;
     while ((constraint = _columnConstraint(orNull: true)) != null) {
|
@ -91,19 +98,21 @@ mixin SchemaParser on ParserBase {
|
||||||
columnName: name.identifier,
|
columnName: name.identifier,
|
||||||
typeName: typeName,
|
typeName: typeName,
|
||||||
constraints: constraints,
|
constraints: constraints,
|
||||||
)..setSpan(name, _previous);
|
)
|
||||||
|
..setSpan(name, _previous)
|
||||||
|
..typeNames = typeTokens;
|
||||||
}
|
}
|
||||||
|
|
||||||
String _typeName() {
|
List<Token> _typeName() {
|
||||||
// sqlite doesn't really define what a type name is and has very loose rules
|
// sqlite doesn't really define what a type name is and has very loose rules
|
||||||
// at turning them into a type affinity. We support this pattern:
|
// at turning them into a type affinity. We support this pattern:
|
||||||
// typename = identifier [ "(" { identifier | comma | number_literal } ")" ]
|
// typename = identifier [ "(" { identifier | comma | number_literal } ")" ]
|
||||||
if (!_matchOne(TokenType.identifier)) return null;
|
if (!_matchOne(TokenType.identifier)) return null;
|
||||||
|
|
||||||
final typeNameBuilder = StringBuffer(_previous.lexeme);
|
final typeNames = [_previous];
|
||||||
|
|
||||||
if (_matchOne(TokenType.leftParen)) {
|
if (_matchOne(TokenType.leftParen)) {
|
||||||
typeNameBuilder.write('(');
|
typeNames.add(_previous);
|
||||||
|
|
||||||
const inBrackets = [
|
const inBrackets = [
|
||||||
TokenType.identifier,
|
TokenType.identifier,
|
||||||
|
@@ -111,14 +120,15 @@ mixin SchemaParser on ParserBase {
         TokenType.numberLiteral
       ];
       while (_match(inBrackets)) {
-        typeNameBuilder..write(' ')..write(_previous.lexeme);
+        typeNames.add(_previous);
       }
 
       _consume(TokenType.rightParen,
           'Expected closing paranthesis to finish type name');
+      typeNames.add(_previous);
     }
 
-    return typeNameBuilder.toString();
+    return typeNames;
   }
 
   ColumnConstraint _columnConstraint({bool orNull = false}) {
@@ -127,10 +137,13 @@ mixin SchemaParser on ParserBase {
     final resolvedName = _constraintNameOrNull();
 
     if (_matchOne(TokenType.primary)) {
+      _suggestHint(HintDescription.token(TokenType.key));
       _consume(TokenType.key, 'Expected KEY to complete PRIMARY KEY clause');
 
       final mode = _orderingModeOrNull();
       final conflict = _conflictClauseOrNull();
+
+      _suggestHint(HintDescription.token(TokenType.autoincrement));
       final hasAutoInc = _matchOne(TokenType.autoincrement);
 
       return PrimaryKeyColumn(resolvedName,
@@ -138,6 +151,8 @@ mixin SchemaParser on ParserBase {
           ..setSpan(first, _previous);
     }
     if (_matchOne(TokenType.not)) {
+      _suggestHint(HintDescription.token(TokenType.$null));
+
       final notToken = _previous;
       final nullToken =
           _consume(TokenType.$null, 'Expected NULL to complete NOT NULL');
@@ -249,6 +264,7 @@ mixin SchemaParser on ParserBase {
   }
 
   ConflictClause _conflictClauseOrNull() {
+    _suggestHint(HintDescription.token(TokenType.on));
     if (_matchOne(TokenType.on)) {
       _consume(TokenType.conflict,
           'Expected CONFLICT to complete ON CONFLICT clause');
@@ -260,6 +276,7 @@ mixin SchemaParser on ParserBase {
       TokenType.ignore: ConflictClause.ignore,
       TokenType.replace: ConflictClause.replace,
     };
+    _suggestHint(HintDescription.tokens(modes.keys.toList()));
 
     if (_match(modes.keys)) {
       return modes[_previous.type];
@@ -284,7 +301,10 @@ mixin SchemaParser on ParserBase {
 
     ReferenceAction onDelete, onUpdate;
 
+    _suggestHint(HintDescription.token(TokenType.on));
     while (_matchOne(TokenType.on)) {
+      _suggestHint(
+          const HintDescription.tokens([TokenType.delete, TokenType.update]));
       if (_matchOne(TokenType.delete)) {
         onDelete = _referenceAction();
       } else if (_matchOne(TokenType.update)) {
@@ -35,6 +35,11 @@ class Scanner {
 
     final endSpan = _file.span(source.length);
     tokens.add(Token(TokenType.eof, endSpan));
+
+    for (var i = 0; i < tokens.length; i++) {
+      tokens[i].index = i;
+    }
+
     return tokens;
   }
@@ -260,7 +260,10 @@ class Token {
   final FileSpan span;
   String get lexeme => span.text;
 
-  const Token(this.type, this.span);
+  /// The index of this [Token] in the list of tokens scanned.
+  int index;
+
+  Token(this.type, this.span);
 
   @override
   String toString() {
@@ -274,7 +277,7 @@ class StringLiteralToken extends Token {
   /// sqlite allows binary strings (x'literal') which are interpreted as blobs.
   final bool binary;
 
-  const StringLiteralToken(this.value, FileSpan span, {this.binary = false})
+  StringLiteralToken(this.value, FileSpan span, {this.binary = false})
       : super(TokenType.stringLiteral, span);
 }
@@ -295,7 +298,7 @@ class IdentifierToken extends Token {
     }
   }
 
-  const IdentifierToken(this.escaped, FileSpan span, {this.synthetic = false})
+  IdentifierToken(this.escaped, FileSpan span, {this.synthetic = false})
       : super(TokenType.identifier, span);
 }