Improved plugin docs

parent 05b7f28f35
commit 42fc691a00

@@ -1,5 +1,15 @@
# ApiClient Plugin

## Installation

Global installation (recommended, requires the `SPARK_PLUGINS` environment variable):

1. Extract the plugin into your global plugin directory (`$SPARK_PLUGINS`)
2. Go to your project directory and run `spark plugin --enable com.noccy.apiclient`

Local installation:

1. Extract the plugin directory into `.spark/plugins`, like a savage.

## Creating catalogs and requests

@@ -14,8 +24,6 @@ Requests are performed with the `api:request` command:

```
$ spark api:request myapp.ping
```

## Internals

ApiClient works on a map of properties, populated with the defaults from the
catalog. The request properties are then applied, followed by the profile
properties.
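
A minimal sketch of that precedence in plain PHP (the keys and values below are made up for illustration; this is not the plugin's actual code):

```php
<?php
// Catalog defaults are applied first, then request properties, then
// profile properties; later arrays win on key collisions.
$catalogDefaults = ["protocol" => "http", "log" => "auto"];
$requestProps    = ["url" => "/ping"];
$profileProps    = ["urlbase" => "https://staging.example.com"];

$effective = array_merge($catalogDefaults, $requestProps, $profileProps);
```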

@@ -29,6 +37,10 @@ protocol={"http"|"websocket"|"xmlrpc"|"jsonrpc"}

# Final URL is [urlbase+]url
urlbase={url}
url={url}
# Control logging
log={"yes"|"no"|"auto"}
# Log bucket, default is 'default'
logfile={name}

## Authentication options
# Username and password, for basic authentication
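
Putting the connection-related keys together, a resolved property map might look like this (a hypothetical sketch; only the keys come from the listing above, the values are assumptions):

```php
<?php
// Hypothetical resolved property map; only the keys are documented above.
$properties = [
    "protocol" => "http",
    "urlbase"  => "https://api.example.com",
    "url"      => "/ping",
    "log"      => "auto",
    "logfile"  => "default",
];

// Final URL is [urlbase+]url: urlbase, if present, is prepended to url.
$finalUrl = ($properties["urlbase"] ?? "") . $properties["url"];
```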

@@ -1 +1,77 @@
# PDO Plugin

## Installation

Global installation (recommended, requires the `SPARK_PLUGINS` environment variable):

1. Extract the plugin into your global plugin directory (`$SPARK_PLUGINS`)
2. Go to your project directory and run `spark plugin --enable com.noccy.pdo`

Local installation:

1. Extract the plugin directory into `.spark/plugins`, like a savage.

## Configuration

To define resources using PHP, add this to the appropriate preloaded file:

```php
create_resource($name, $type, $uri);
```
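
For example, a MySQL-backed PDO resource named `db` might be registered as shown below. How the `pdo+mysql` convention described further down maps onto `$type` and `$uri` is an assumption here, so treat this as a sketch rather than the definitive call:

```php
<?php
// Hypothetical call: the split of "pdo+mysql://..." into $type and $uri
// is assumed, based on the type+URI convention described below.
create_resource("db", "pdo", "mysql://user:pass@localhost:3306/appdb");
```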

You can also use the config file `.spark/resources.json`:

```json
{
    "resources": {}
}
```

Resources are defined with the name as the key, and the *full* connection URI
as the value. The URI should contain the resource type as well as the resource
URI separated by a plus sign, in this case `pdo+mysql`, `pdo+sqlite` etc.:

```json
...
"db": "pdo+mysql://user:pass@host:port/dbname"
...
```

You can also create a temporary in-memory SQLite database by using `pdo+sqlite:`
as the value.
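
A small sketch of that naming convention (illustration only, not the plugin's actual parsing code; the example value is assumed):

```php
<?php
// The resource value splits on the first "+" into the resource type and
// the connection URI, e.g. "pdo" and "mysql://user:pass@host:3306/dbname".
[$type, $uri] = explode("+", "pdo+mysql://user:pass@host:3306/dbname", 2);
```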

## Usage

Storing queries:

$ spark pdo:store --res otherdb \                      # resource to store the query for
    "getuserid" \                                      # query name
    "select id from users where username=:username" \  # query
    :username                                          # slot

List stored queries:

$ spark pdo:store

Delete a stored query:

$ spark pdo:store --remove getuserid

Recalling queries:

$ spark pdo:query --recall getuserid username=bob
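
Recalling a stored query like this is roughly equivalent to the plain PDO code below (a sketch only; the DSN and credentials are placeholders, not something the plugin prescribes):

```php
<?php
// Rough plain-PDO equivalent of recalling "getuserid" with username=bob.
$pdo  = new PDO("mysql:host=localhost;dbname=appdb", "user", "pass");
$stmt = $pdo->prepare("select id from users where username=:username");
$stmt->execute([":username" => "bob"]);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
```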

Direct query:

$ spark pdo:query "select * from users"
$ spark pdo:query --res otherdb "select * from users"
$ spark pdo:query --vertical "select * from users where id=:id" id=42
$ spark pdo:query --csv "select username from users where id=:id" id=42
$ spark pdo:query --box --vertical "select name,value from config"

Execute without result:

$ spark pdo:exec "drop table foo"
$ spark pdo:exec --res otherdb "drop table bar"